Jan 27 11:20:19 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 27 11:20:19 crc restorecon[4692]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:19 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 
11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc 
restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 27 11:20:20 crc restorecon[4692]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 27 11:20:21 crc kubenswrapper[4775]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.434528 4775 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448609 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448648 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448654 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448660 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448666 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448672 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448678 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448683 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448689 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448696 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448702 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448709 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448717 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448725 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448730 4775 feature_gate.go:330] unrecognized feature gate: Example Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448737 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448746 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448756 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448763 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448772 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448779 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448786 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448792 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448799 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448806 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448813 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448819 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448825 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448831 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448838 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448844 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448851 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448857 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448864 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448870 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448879 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448886 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448892 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448899 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448906 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448920 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448931 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448938 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448946 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448953 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448962 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448970 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448975 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448980 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448985 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448991 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.448996 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449001 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449006 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449011 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449017 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449022 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449028 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449033 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449038 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449043 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449048 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449053 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449058 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449063 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449068 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449073 4775 feature_gate.go:330] unrecognized feature 
gate: VolumeGroupSnapshot Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449079 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449084 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449089 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.449099 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449249 4775 flags.go:64] FLAG: --address="0.0.0.0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449274 4775 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449292 4775 flags.go:64] FLAG: --anonymous-auth="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449302 4775 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449314 4775 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449322 4775 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449334 4775 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449345 4775 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449353 4775 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449361 4775 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449368 4775 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449375 4775 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449383 4775 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449390 4775 flags.go:64] FLAG: --cgroup-root="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449397 4775 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449404 4775 flags.go:64] FLAG: --client-ca-file="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449410 4775 flags.go:64] FLAG: --cloud-config="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449417 4775 flags.go:64] FLAG: --cloud-provider="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449425 4775 flags.go:64] FLAG: --cluster-dns="[]" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449434 4775 flags.go:64] FLAG: --cluster-domain="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449441 4775 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449479 4775 flags.go:64] FLAG: --config-dir="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449487 4775 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449496 4775 flags.go:64] FLAG: --container-log-max-files="5" Jan 27 11:20:21 crc kubenswrapper[4775]: 
I0127 11:20:21.449506 4775 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449515 4775 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449523 4775 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449531 4775 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449539 4775 flags.go:64] FLAG: --contention-profiling="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449547 4775 flags.go:64] FLAG: --cpu-cfs-quota="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449555 4775 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449564 4775 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449571 4775 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449582 4775 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449590 4775 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449598 4775 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449606 4775 flags.go:64] FLAG: --enable-load-reader="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449614 4775 flags.go:64] FLAG: --enable-server="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449623 4775 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449646 4775 flags.go:64] FLAG: --event-burst="100" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449656 4775 flags.go:64] FLAG: --event-qps="50" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449664 4775 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449672 4775 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449680 4775 flags.go:64] FLAG: --eviction-hard="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449691 4775 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449699 4775 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449707 4775 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449718 4775 flags.go:64] FLAG: --eviction-soft="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449727 4775 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449735 4775 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449743 4775 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449750 4775 flags.go:64] FLAG: --experimental-mounter-path="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449758 4775 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449766 4775 flags.go:64] FLAG: --fail-swap-on="true" Jan 27 11:20:21 
crc kubenswrapper[4775]: I0127 11:20:21.449773 4775 flags.go:64] FLAG: --feature-gates="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449781 4775 flags.go:64] FLAG: --file-check-frequency="20s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449790 4775 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449819 4775 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449826 4775 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449834 4775 flags.go:64] FLAG: --healthz-port="10248" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449840 4775 flags.go:64] FLAG: --help="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449847 4775 flags.go:64] FLAG: --hostname-override="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449853 4775 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449860 4775 flags.go:64] FLAG: --http-check-frequency="20s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449866 4775 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449872 4775 flags.go:64] FLAG: --image-credential-provider-config="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449878 4775 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449885 4775 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449891 4775 flags.go:64] FLAG: --image-service-endpoint="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449897 4775 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449906 4775 flags.go:64] FLAG: --kube-api-burst="100" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449913 4775 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449919 4775 flags.go:64] FLAG: --kube-api-qps="50" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449926 4775 flags.go:64] FLAG: --kube-reserved="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449932 4775 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449938 4775 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449945 4775 flags.go:64] FLAG: --kubelet-cgroups="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449951 4775 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449957 4775 flags.go:64] FLAG: --lock-file="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449964 4775 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449970 4775 flags.go:64] FLAG: --log-flush-frequency="5s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449977 4775 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449987 4775 flags.go:64] FLAG: --log-json-split-stream="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.449995 4775 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 
11:20:21.450001 4775 flags.go:64] FLAG: --log-text-split-stream="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450008 4775 flags.go:64] FLAG: --logging-format="text" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450014 4775 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450020 4775 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450026 4775 flags.go:64] FLAG: --manifest-url="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450033 4775 flags.go:64] FLAG: --manifest-url-header="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450042 4775 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450048 4775 flags.go:64] FLAG: --max-open-files="1000000" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450056 4775 flags.go:64] FLAG: --max-pods="110" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450062 4775 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450069 4775 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450075 4775 flags.go:64] FLAG: --memory-manager-policy="None" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450081 4775 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450088 4775 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450094 4775 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450100 4775 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450119 4775 flags.go:64] FLAG: --node-status-max-images="50" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450126 4775 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450133 4775 flags.go:64] FLAG: --oom-score-adj="-999" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450140 4775 flags.go:64] FLAG: --pod-cidr="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450146 4775 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450155 4775 flags.go:64] FLAG: --pod-manifest-path="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450161 4775 flags.go:64] FLAG: --pod-max-pids="-1" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450168 4775 flags.go:64] FLAG: --pods-per-core="0" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450174 4775 flags.go:64] FLAG: --port="10250" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450180 4775 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450187 4775 flags.go:64] FLAG: --provider-id="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450193 4775 flags.go:64] FLAG: --qos-reserved="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450199 4775 flags.go:64] FLAG: --read-only-port="10255" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 
11:20:21.450205 4775 flags.go:64] FLAG: --register-node="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450211 4775 flags.go:64] FLAG: --register-schedulable="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450217 4775 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450229 4775 flags.go:64] FLAG: --registry-burst="10" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450235 4775 flags.go:64] FLAG: --registry-qps="5" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450241 4775 flags.go:64] FLAG: --reserved-cpus="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450249 4775 flags.go:64] FLAG: --reserved-memory="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450258 4775 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450264 4775 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450271 4775 flags.go:64] FLAG: --rotate-certificates="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450277 4775 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450283 4775 flags.go:64] FLAG: --runonce="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450289 4775 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450295 4775 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450302 4775 flags.go:64] FLAG: --seccomp-default="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450308 4775 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450314 4775 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450320 4775 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450327 4775 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450333 4775 flags.go:64] FLAG: --storage-driver-password="root" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450340 4775 flags.go:64] FLAG: --storage-driver-secure="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450347 4775 flags.go:64] FLAG: --storage-driver-table="stats" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450353 4775 flags.go:64] FLAG: --storage-driver-user="root" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450359 4775 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450366 4775 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450373 4775 flags.go:64] FLAG: --system-cgroups="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450380 4775 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450394 4775 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450402 4775 flags.go:64] FLAG: --tls-cert-file="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450411 4775 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 27 11:20:21 crc kubenswrapper[4775]: 
I0127 11:20:21.450427 4775 flags.go:64] FLAG: --tls-min-version="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450434 4775 flags.go:64] FLAG: --tls-private-key-file="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450440 4775 flags.go:64] FLAG: --topology-manager-policy="none" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450446 4775 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450476 4775 flags.go:64] FLAG: --topology-manager-scope="container" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450485 4775 flags.go:64] FLAG: --v="2" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450496 4775 flags.go:64] FLAG: --version="false" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450507 4775 flags.go:64] FLAG: --vmodule="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450517 4775 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.450527 4775 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450737 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450751 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450761 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450769 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450776 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450783 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450790 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450797 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450805 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450812 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450819 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450825 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450830 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450836 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450842 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450847 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450852 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 11:20:21 
crc kubenswrapper[4775]: W0127 11:20:21.450857 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450863 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450868 4775 feature_gate.go:330] unrecognized feature gate: Example Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450874 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450879 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450884 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450889 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450895 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450900 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450905 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450911 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450916 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450921 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450926 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450932 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450939 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450945 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
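Interleaved with the gate warnings, the FLAG: block above (flags.go:64) is the kubelet walking its parsed flag set at --v=2 and printing each flag's effective value, deprecated or not. A short Go sketch of the same pattern with pflag; the two flags registered here are stand-ins, not the kubelet's real set.

    package main

    import (
    	"log"

    	"github.com/spf13/pflag"
    )

    func main() {
    	fs := pflag.NewFlagSet("kubelet-sketch", pflag.ExitOnError)
    	fs.String("config", "/etc/kubernetes/kubelet.conf", "path to the kubelet config file")
    	fs.Int32("v", 2, "log verbosity")
    	_ = fs.Parse([]string{"--v=2"})

    	// Visit every registered flag and print its effective value, the same
    	// shape as the FLAG: --name="value" lines in this journal.
    	fs.VisitAll(func(f *pflag.Flag) {
    		log.Printf("FLAG: --%s=%q", f.Name, f.Value)
    	})
    }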
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450952 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450959 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450965 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450970 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450977 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450982 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450987 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450993 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.450998 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451003 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451018 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451023 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451030 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451036 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451041 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451047 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451052 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451058 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451063 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451069 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451074 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451079 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451084 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451092 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451097 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451103 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451108 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451113 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451118 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451125 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451132 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451138 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451145 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451152 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451157 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451163 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.451168 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.451189 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.460757 4775 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.460786 4775 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460867 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460878 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
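Each warning pass ends with the resolved map at feature_gate.go:386: only gates the kubelet actually registers survive, with the deprecated KMSv1 and the locked GA gates set as the warnings describe. Below is a Go sketch against the upstream k8s.io/component-base/featuregate API with a three-gate illustrative registry. Upstream SetFromMap errors, rather than warns, on an unknown gate, which is why unknown gates have to be filtered out before this step; setting a deprecated or GA gate produces warnings like the feature_gate.go:351/:353 lines seen above.

    package main

    import (
    	"fmt"

    	"k8s.io/component-base/featuregate"
    )

    func main() {
    	// Illustrative three-gate registry; the real kubelet registers far more.
    	fg := featuregate.NewFeatureGate()
    	if err := fg.Add(map[featuregate.Feature]featuregate.FeatureSpec{
    		"KMSv1":                 {Default: false, PreRelease: featuregate.Deprecated},
    		"CloudDualStackNodeIPs": {Default: true, LockToDefault: true, PreRelease: featuregate.GA},
    		"NodeSwap":              {Default: false, PreRelease: featuregate.Beta},
    	}); err != nil {
    		panic(err)
    	}
    	// Apply the (already filtered) desired map; deprecated/GA entries
    	// trigger klog warnings like the ones in this journal.
    	if err := fg.SetFromMap(map[string]bool{
    		"KMSv1":                 true,
    		"CloudDualStackNodeIPs": true,
    	}); err != nil {
    		panic(err)
    	}
    	fmt.Println("KMSv1 enabled:", fg.Enabled("KMSv1"))       // true
    	fmt.Println("NodeSwap enabled:", fg.Enabled("NodeSwap")) // false (default)
    }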
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460887 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460893 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460898 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460903 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460907 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460912 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460918 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460922 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460926 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460931 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460935 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460938 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460942 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460946 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460950 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460953 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460957 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460960 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460964 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460968 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460971 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460974 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460978 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460983 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460988 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460992 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460995 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.460999 4775 feature_gate.go:330] unrecognized feature gate: Example Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461003 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461007 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461011 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461015 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461032 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461036 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461039 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461044 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461048 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461052 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461056 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461059 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461063 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461066 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461070 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461073 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461077 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461080 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461084 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461088 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461092 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461095 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 
11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461099 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461102 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461106 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461109 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461113 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461117 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461122 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461127 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461131 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461136 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461140 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461145 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461149 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461153 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461156 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461160 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461163 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461167 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461172 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.461179 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461321 4775 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461330 4775 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 
11:20:21.461334 4775 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461338 4775 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461342 4775 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461345 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461349 4775 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461352 4775 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461356 4775 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461360 4775 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461363 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461368 4775 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461374 4775 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461378 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461383 4775 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461386 4775 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461390 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461395 4775 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461400 4775 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461404 4775 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461408 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461413 4775 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461417 4775 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461422 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461426 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461430 4775 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461434 4775 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461438 4775 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461441 4775 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461460 4775 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461465 4775 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461470 4775 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461474 4775 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461478 4775 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461483 4775 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461487 4775 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461490 4775 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461494 4775 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461498 4775 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461502 4775 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461506 4775 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461510 4775 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461513 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461517 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461521 4775 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461524 4775 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461528 4775 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461533 4775 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461538 4775 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461543 4775 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461548 4775 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461552 4775 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461556 4775 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461560 4775 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461564 4775 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461568 4775 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461573 4775 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461577 4775 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461581 4775 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461585 4775 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461589 4775 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461593 4775 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461597 4775 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461601 4775 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461604 4775 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461608 4775 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461612 4775 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461616 4775 feature_gate.go:330] unrecognized feature gate: Example Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461620 4775 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461624 4775 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.461632 4775 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.461639 4775 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.461853 4775 server.go:940] "Client rotation is on, will bootstrap in background" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.465862 4775 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.465943 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.467508 4775 server.go:997] "Starting client certificate rotation" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.467526 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.473998 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-26 12:05:45.79151047 +0000 UTC Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.474120 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.517264 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.523907 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.536703 4775 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.556478 4775 log.go:25] "Validated CRI v1 runtime API" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.596377 4775 log.go:25] "Validated CRI v1 image API" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.601943 4775 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.627978 4775 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-27-11-16-00-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.628015 4775 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.642786 4775 manager.go:217] Machine: {Timestamp:2026-01-27 11:20:21.639990829 +0000 UTC m=+0.781588626 
CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:574d97c2-3ebe-40ee-9434-ec47862a34d4 BootID:a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:1a:1e:d0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:1a:1e:d0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:d2:b2:83 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f2:a8:f9 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:3a:21:20 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:50:c1:bb Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9e:af:e3:3d:82:52 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:02:1b:8a:bc:7d:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.642997 4775 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.643128 4775 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.643423 4775 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.643728 4775 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.643766 4775 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.643994 4775 topology_manager.go:138] "Creating topology manager with none policy" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.644005 4775 container_manager_linux.go:303] "Creating device plugin manager" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.644525 4775 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.644559 4775 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.645233 4775 state_mem.go:36] "Initialized new in-memory state store" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.645337 4775 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.649058 4775 kubelet.go:418] "Attempting to sync node with API server" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.649079 4775 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.649106 4775 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.649121 4775 kubelet.go:324] "Adding apiserver pod source" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.649136 4775 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.655234 4775 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.657105 4775 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.658059 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.658191 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.658617 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.658759 4775 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.658747 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.666907 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.666967 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.666985 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.666999 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667022 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667058 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667071 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667095 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667112 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667126 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667154 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.667168 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.668333 4775 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 
11:20:21.669341 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.669424 4775 server.go:1280] "Started kubelet" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.671270 4775 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.671279 4775 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 27 11:20:21 crc systemd[1]: Started Kubernetes Kubelet. Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.676884 4775 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.677496 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.677612 4775 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.677676 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 02:46:58.205811465 +0000 UTC Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.679482 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.684717 4775 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.684758 4775 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.685080 4775 factory.go:55] Registering systemd factory Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.685294 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.685364 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.685104 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.685834 4775 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686076 4775 factory.go:221] Registration of the systemd container factory successfully Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686542 4775 factory.go:153] Registering CRI-O factory Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686600 4775 factory.go:221] Registration of the crio container factory 
successfully Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686739 4775 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686786 4775 factory.go:103] Registering Raw factory Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.686825 4775 manager.go:1196] Started watching for new ooms in manager Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.687984 4775 manager.go:319] Starting recovery of all containers Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.685318 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e92893919a9f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:20:21.669341683 +0000 UTC m=+0.810939510,LastTimestamp:2026-01-27 11:20:21.669341683 +0000 UTC m=+0.810939510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.688496 4775 server.go:460] "Adding debug handlers to kubelet server" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698813 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698871 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698889 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698902 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698915 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698929 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" 
seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698946 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698966 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.698990 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699024 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699037 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699051 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699105 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699119 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699131 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699160 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699177 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 27 11:20:21 crc 
kubenswrapper[4775]: I0127 11:20:21.699189 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699202 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699215 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699245 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699257 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699301 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699350 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699365 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699380 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699398 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699412 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 27 
11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699514 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699595 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699611 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699658 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699672 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699685 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699698 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699715 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699727 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699742 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699755 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 27 11:20:21 crc 
kubenswrapper[4775]: I0127 11:20:21.699792 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699821 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699839 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699854 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699868 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699880 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699893 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.699979 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700031 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700045 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700061 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: 
I0127 11:20:21.700072 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700109 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700210 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700225 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700240 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700330 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700349 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700406 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700420 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700433 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700559 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700570 4775 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700580 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700616 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700667 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700687 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700699 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700714 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700724 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700736 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700771 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700806 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700819 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700831 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700842 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700853 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700865 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700874 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700883 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700916 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700925 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700934 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700943 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700953 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700982 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.700991 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701002 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701032 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701041 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701051 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701060 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701069 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701095 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701105 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701132 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701160 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701170 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701181 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701188 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701199 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701209 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701218 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701267 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701326 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701396 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701406 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701417 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701483 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701500 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701568 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701579 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701591 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701601 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701626 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701635 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701643 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701667 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701676 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701685 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701694 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701704 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701713 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701721 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701730 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701743 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701752 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701777 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701787 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701810 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701818 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701826 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701835 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701845 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701857 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701870 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701883 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701911 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701920 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701927 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701939 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701951 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701963 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701975 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.701990 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702004 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702020 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702031 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702041 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702051 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702061 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702071 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702081 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702092 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702105 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702117 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702130 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702140 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702150 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702161 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702172 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702183 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702195 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702206 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702217 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702228 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702238 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702249 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.702260 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.703957 4775 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.703986 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704002 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704013 4775 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704027 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704041 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704053 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704067 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704080 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704094 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704110 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704125 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704137 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704151 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704165 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704179 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704193 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704207 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704223 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704239 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704272 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704286 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704302 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704323 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704368 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704384 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704398 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704412 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704426 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704470 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704486 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704501 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704517 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704529 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704545 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704557 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704570 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704583 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704596 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704609 4775 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704620 4775 reconstruct.go:97] "Volume reconstruction finished" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.704628 4775 reconciler.go:26] "Reconciler: start to sync state" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.717658 4775 manager.go:324] Recovery completed Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.730244 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.731415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.731485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.731496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.732190 4775 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.732242 4775 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.732317 4775 state_mem.go:36] "Initialized new in-memory state store" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.741300 4775 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.743615 4775 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.743660 4775 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.743691 4775 kubelet.go:2335] "Starting kubelet main sync loop" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.743846 4775 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 27 11:20:21 crc kubenswrapper[4775]: W0127 11:20:21.744595 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.744720 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.760197 4775 policy_none.go:49] "None policy: Start" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.761827 4775 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.761858 4775 state_mem.go:35] "Initializing new in-memory state store" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.779841 4775 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.828596 4775 manager.go:334] "Starting Device Plugin manager" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.828651 4775 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.828665 4775 server.go:79] "Starting device plugin registration server" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.829171 4775 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.829494 4775 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.829815 4775 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.829908 4775 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.829917 4775 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.836788 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.844997 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:20:21 crc kubenswrapper[4775]: 
I0127 11:20:21.845093 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.846512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.846554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.846567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.846775 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847029 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847094 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847713 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847803 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.847988 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848032 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848672 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848834 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848940 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.848973 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849336 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849669 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849694 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849827 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.849968 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850001 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850710 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850733 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.850738 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.851421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.851470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.851483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.886518 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.906734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.906850 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.906887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.906917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.906955 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907186 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907264 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907283 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.907534 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.929571 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.930781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.930814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.930824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:21 crc kubenswrapper[4775]: I0127 11:20:21.930843 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:21 crc kubenswrapper[4775]: E0127 11:20:21.931203 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008445 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008566 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008621 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008663 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008652 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008703 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008688 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008749 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008790 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008810 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008844 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008856 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.008946 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.132147 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.133512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.133553 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.133563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.133593 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.134053 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.187167 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.193335 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.211047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.229347 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.235185 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.287962 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms" Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.331569 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-1adf35527e12a87acccb733f5e83f3e640423b6e584b579931030bb6abbc9352 WatchSource:0}: Error finding container 1adf35527e12a87acccb733f5e83f3e640423b6e584b579931030bb6abbc9352: Status 404 returned error can't find the container with id 1adf35527e12a87acccb733f5e83f3e640423b6e584b579931030bb6abbc9352 Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.331926 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-239f203798f26ab8d5d3423c8ff9dca908d2f90a3684ba48ecfe06d3553da543 WatchSource:0}: Error finding container 239f203798f26ab8d5d3423c8ff9dca908d2f90a3684ba48ecfe06d3553da543: Status 404 returned error can't find the container with id 239f203798f26ab8d5d3423c8ff9dca908d2f90a3684ba48ecfe06d3553da543 Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.332246 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-847757953da141ee96133519a185ae845a53af6d835cca0b2735bcc8db02784f WatchSource:0}: Error finding container 847757953da141ee96133519a185ae845a53af6d835cca0b2735bcc8db02784f: Status 404 returned error can't find the container with id 847757953da141ee96133519a185ae845a53af6d835cca0b2735bcc8db02784f Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.334233 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a7a76a002d2ae86f702a5f8720160288dfcac4091e1a1b309b9b9f6c28511f10 WatchSource:0}: Error finding container a7a76a002d2ae86f702a5f8720160288dfcac4091e1a1b309b9b9f6c28511f10: Status 404 returned error can't find the container with id a7a76a002d2ae86f702a5f8720160288dfcac4091e1a1b309b9b9f6c28511f10 Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.335178 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-3fbe11ab6a3617fcd9f86d014870ea02b1ad310df40b1b08d15d461f78c914dc WatchSource:0}: Error finding container 3fbe11ab6a3617fcd9f86d014870ea02b1ad310df40b1b08d15d461f78c914dc: Status 404 returned error can't find the container with id 3fbe11ab6a3617fcd9f86d014870ea02b1ad310df40b1b08d15d461f78c914dc Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.535125 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.536293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.536351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.536369 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.536407 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.537060 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.670317 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.678485 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 19:31:11.040935928 +0000 UTC Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.684214 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.684292 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.748030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a7a76a002d2ae86f702a5f8720160288dfcac4091e1a1b309b9b9f6c28511f10"} Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.749221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3fbe11ab6a3617fcd9f86d014870ea02b1ad310df40b1b08d15d461f78c914dc"} Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.750177 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"1adf35527e12a87acccb733f5e83f3e640423b6e584b579931030bb6abbc9352"} Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.751080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"847757953da141ee96133519a185ae845a53af6d835cca0b2735bcc8db02784f"} Jan 27 11:20:22 crc kubenswrapper[4775]: I0127 11:20:22.755587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"239f203798f26ab8d5d3423c8ff9dca908d2f90a3684ba48ecfe06d3553da543"} Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.925988 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.926091 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:22 crc kubenswrapper[4775]: W0127 11:20:22.935828 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:22 crc kubenswrapper[4775]: E0127 11:20:22.935869 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:23 crc kubenswrapper[4775]: E0127 11:20:23.089508 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s" Jan 27 11:20:23 crc kubenswrapper[4775]: W0127 11:20:23.265774 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:23 crc kubenswrapper[4775]: E0127 11:20:23.265890 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.337486 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.339214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.339267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.339281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.339312 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:23 crc kubenswrapper[4775]: E0127 11:20:23.339749 4775 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.564225 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 11:20:23 crc kubenswrapper[4775]: E0127 11:20:23.565392 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.670615 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:23 crc kubenswrapper[4775]: I0127 11:20:23.678666 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 04:38:57.221937049 +0000 UTC Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.670685 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.679776 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 02:44:59.147615135 +0000 UTC Jan 27 11:20:24 crc kubenswrapper[4775]: E0127 11:20:24.691068 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.762215 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="58e703b38d9a62b363f9e9a551ab55f22f907822ffe7e1beb3820c1a8630e3be" exitCode=0 Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.762321 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"58e703b38d9a62b363f9e9a551ab55f22f907822ffe7e1beb3820c1a8630e3be"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.762399 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.763814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.763853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.763865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.765109 4775 generic.go:334] 
"Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="07f40e7bb26489a4c9421910cd611be18daa774dfe8333a1c35f205b36d42648" exitCode=0 Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.765168 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"07f40e7bb26489a4c9421910cd611be18daa774dfe8333a1c35f205b36d42648"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.765263 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.766963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.766995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.767004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.770440 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685" exitCode=0 Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.770549 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.770584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.771822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.771863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.771877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.774743 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.774805 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.775625 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.776811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.776865 4775 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.776884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.778071 4775 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce" exitCode=0 Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.778129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce"} Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.778149 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.778954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.778999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.779013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.940093 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.946922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.946962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.946972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:24 crc kubenswrapper[4775]: I0127 11:20:24.947003 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:24 crc kubenswrapper[4775]: E0127 11:20:24.947535 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:25 crc kubenswrapper[4775]: W0127 11:20:25.246055 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:25 crc kubenswrapper[4775]: E0127 11:20:25.246150 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:25 crc kubenswrapper[4775]: W0127 11:20:25.416555 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": 
dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:25 crc kubenswrapper[4775]: E0127 11:20:25.416657 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.671122 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.680760 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 09:13:01.164948572 +0000 UTC Jan 27 11:20:25 crc kubenswrapper[4775]: W0127 11:20:25.739764 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:25 crc kubenswrapper[4775]: E0127 11:20:25.739862 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:25 crc kubenswrapper[4775]: W0127 11:20:25.745856 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:25 crc kubenswrapper[4775]: E0127 11:20:25.745926 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.782208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.782268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.784665 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e292e9377192358d3c39c87ae0503a4f6fa57867d7b3b17160706e93e4f96c6e" exitCode=0 Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.784696 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e292e9377192358d3c39c87ae0503a4f6fa57867d7b3b17160706e93e4f96c6e"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.784770 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.785666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.785751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.785775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.786682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf4b740a0fc72eeea73daa6a771713a6cb71112eaaae6cb91d4559f71568ad40"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.786707 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.787671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.787718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.787740 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.788944 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.788975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.791402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.791429 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45"} Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.791471 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.791986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.792014 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:25 crc kubenswrapper[4775]: I0127 11:20:25.792026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.025318 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.027583 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.406592 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.406829 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" start-of-body= Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.406927 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": dial tcp 192.168.126.11:10357: connect: connection refused" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.670366 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.681539 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:10:03.160516812 +0000 UTC Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.797989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736"} Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.798137 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.799487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.799552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.799578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.801476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b24cf3dc6d58d4da096459cb436952831c57cc5d5c014a1549dc2dc8e0f1e9e6"} Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 
11:20:26.801485 4775 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b24cf3dc6d58d4da096459cb436952831c57cc5d5c014a1549dc2dc8e0f1e9e6" exitCode=0 Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.801643 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.802741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.802784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.802802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f"} Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"} Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807499 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807506 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"} Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807523 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.807565 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.808868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.809418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:26 crc 
kubenswrapper[4775]: I0127 11:20:26.809487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:26 crc kubenswrapper[4775]: I0127 11:20:26.809499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:26 crc kubenswrapper[4775]: E0127 11:20:26.890658 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188e92893919a9f3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:20:21.669341683 +0000 UTC m=+0.810939510,LastTimestamp:2026-01-27 11:20:21.669341683 +0000 UTC m=+0.810939510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.570364 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 27 11:20:27 crc kubenswrapper[4775]: E0127 11:20:27.571846 4775 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.22:6443: connect: connection refused" logger="UnhandledError" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.671062 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.682074 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 07:18:25.173092112 +0000 UTC Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.753067 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.818403 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.819272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6687109ebae71609761bb7674ad0fe39e13fe8bc2cb7b00864ba4ea8102c8a1c"} Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.819340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0f842925d8575e8418e824ed06e90845663d4203b9b2daeb630adf6dc57d44a0"} Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.819439 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.819561 4775 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.820230 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.820285 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:27 crc kubenswrapper[4775]: I0127 11:20:27.824902 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:27 crc kubenswrapper[4775]: E0127 11:20:27.892036 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.148014 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.149096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.149141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.149154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.149184 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:28 crc kubenswrapper[4775]: E0127 11:20:28.149691 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.22:6443: connect: connection refused" node="crc" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.285665 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 
11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.665537 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.666051 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.666135 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.670835 4775 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.22:6443: connect: connection refused Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.683637 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:16:43.474653881 +0000 UTC Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.823600 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ffb69bb3e1f6f81b7881ee50aea2af93e70b41dcccebad03b27cfd4787fe32f5"} Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.823650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a7374ffee4491b4696222baeca8ef5ee01466378c62586300f853725afbb1645"} Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.823663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d384d74f9b2a90f26598016a124dc519c30ec4afb50a2a5a4ea857eafab9f145"} Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.823744 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.824797 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.824832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.824845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.825508 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.827772 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f" exitCode=255 Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 
11:20:28.827828 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f"} Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.827860 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.827898 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.828922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:28 crc kubenswrapper[4775]: I0127 11:20:28.829259 4775 scope.go:117] "RemoveContainer" containerID="67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.684344 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 14:56:50.47611871 +0000 UTC Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.838829 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.841664 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34"} Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.841723 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.841766 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.841826 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.841856 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.842848 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.842919 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.842938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:29 crc kubenswrapper[4775]: I0127 11:20:29.843654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.131186 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.521882 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.684789 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 01:47:24.479587071 +0000 UTC Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.844951 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.845060 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.844958 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:30 crc kubenswrapper[4775]: I0127 11:20:30.846792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.446396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:20:31 crc 
kubenswrapper[4775]: I0127 11:20:31.446661 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.447996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.448056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.448073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.685130 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 19:07:53.46731658 +0000 UTC
Jan 27 11:20:31 crc kubenswrapper[4775]: E0127 11:20:31.837081 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.846424 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.846498 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.847317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.847354 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.847365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.960826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.961099 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.962668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.962729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:31 crc kubenswrapper[4775]: I0127 11:20:31.962747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.106158 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.685793 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 22:52:36.291374187 +0000 UTC
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.849970 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.851314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.851395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:32 crc kubenswrapper[4775]: I0127 11:20:32.851431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:33 crc kubenswrapper[4775]: I0127 11:20:33.686302 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:26:50.90700261 +0000 UTC
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.549814 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.551229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.551288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.551307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.551342 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Jan 27 11:20:34 crc kubenswrapper[4775]: I0127 11:20:34.686846 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 13:23:10.787348889 +0000 UTC
Jan 27 11:20:35 crc kubenswrapper[4775]: I0127 11:20:35.686951 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 02:22:10.710456637 +0000 UTC
Jan 27 11:20:35 crc kubenswrapper[4775]: I0127 11:20:35.704263 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.031230 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.031858 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.033353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.033393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.033405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:36 crc kubenswrapper[4775]: I0127 11:20:36.687843 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 11:36:20.122665657 +0000 UTC
Jan 27 11:20:37 crc kubenswrapper[4775]: I0127 11:20:37.688165 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 06:59:30.243067354 +0000 UTC
Jan 27 11:20:38 crc kubenswrapper[4775]: I0127 11:20:38.688230 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:02:59.704966777 +0000 UTC
Jan 27 11:20:38 crc kubenswrapper[4775]: W0127 11:20:38.961108 4775 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Jan 27 11:20:38 crc kubenswrapper[4775]: I0127 11:20:38.961232 4775 trace.go:236] Trace[1488209893]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:28.959) (total time: 10001ms):
Jan 27 11:20:38 crc kubenswrapper[4775]: Trace[1488209893]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (11:20:38.961)
Jan 27 11:20:38 crc kubenswrapper[4775]: Trace[1488209893]: [10.001598797s] [10.001598797s] END
Jan 27 11:20:38 crc kubenswrapper[4775]: E0127 11:20:38.961260 4775 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.234837 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.235185 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.241276 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.241359 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.407570 4775 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.407880 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:20:39 crc kubenswrapper[4775]: I0127 11:20:39.688762 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 23:08:25.742191869 +0000 UTC
Jan 27 11:20:40 crc kubenswrapper[4775]: I0127 11:20:40.689796 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:06:15.049087387 +0000 UTC
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.690038 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 14:13:52.027442024 +0000 UTC
Jan 27 11:20:41 crc kubenswrapper[4775]: E0127 11:20:41.837145 4775 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.985107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.985261 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.986153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.986175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.986184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:41 crc kubenswrapper[4775]: I0127 11:20:41.999234 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.107341 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.107785 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.690891 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 05:49:01.004458396 +0000 UTC
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.873975 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.874896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.874943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:42 crc kubenswrapper[4775]: I0127 11:20:42.874959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.403175 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.403249 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.669080 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.669268 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.669652 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.669697 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.670202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.670228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.670260 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.673656 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.692035 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 09:02:23.39536918 +0000 UTC
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.875774 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.876779 4775 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.876841 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.876937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.876985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:20:43 crc kubenswrapper[4775]: I0127 11:20:43.876997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.228038 4775 trace.go:236] Trace[502768675]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:29.720) (total time: 14507ms):
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[502768675]: ---"Objects listed" error: 14507ms (11:20:44.227)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[502768675]: [14.507100201s] [14.507100201s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.228391 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.229230 4775 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230428 4775 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230502 4775 trace.go:236] Trace[222843817]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:31.716) (total time: 12514ms):
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[222843817]: ---"Objects listed" error: 12514ms (11:20:44.230)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[222843817]: [12.514110535s] [12.514110535s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230529 4775 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230902 4775 trace.go:236] Trace[1111838512]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 11:20:31.712) (total time: 12518ms):
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[1111838512]: ---"Objects listed" error: 12518ms (11:20:44.230)
Jan 27 11:20:44 crc kubenswrapper[4775]: Trace[1111838512]: [12.518236096s] [12.518236096s] END
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.230924 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.231700 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.257717 4775 csr.go:261] certificate signing request csr-9nmvr is approved, waiting to be issued
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.265398 4775 csr.go:257] certificate signing request csr-9nmvr is issued
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.663055 4775 apiserver.go:52] "Watching apiserver"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667138 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667513 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-vxn5f","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.667867 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668010 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668067 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668154 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668152 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668226 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668393 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vxn5f"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.668850 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.668932 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.672077 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.676180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.676187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677009 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677192 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677561 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677580 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.677773 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.678242 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.679238 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.679343 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.684069 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.686752 4775 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.692170 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 19:03:34.710087269 +0000 UTC
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.699934 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.729366 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734708 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734764 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734782 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734799 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734821 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734835 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734849 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734864 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734923 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734936 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734952 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734981 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.734994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735011 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735143 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735162 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735191 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735207 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735255 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735271 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735314 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735344 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735358 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735377 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735419 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735511 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735526 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735541 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735556 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735570 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735637 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735653 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735711 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735814 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735837 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735926 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735947 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735967 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.735990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736010 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736071 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736112 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736153 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736176 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736196 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736216 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736235 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736256 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736275 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736295 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736339 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736435 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736500 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736528 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736556 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736579 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736662 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736705 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736766 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736942 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736966 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.736988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Jan 27 11:20:44 crc
kubenswrapper[4775]: I0127 11:20:44.737138 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737179 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737312 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737333 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.737355 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737421 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737520 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737546 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737568 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737620 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737641 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737805 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737832 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737929 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737952 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737976 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738119 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738146 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738249 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738274 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738322 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738344 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738390 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738413 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738439 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738679 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 11:20:44 
crc kubenswrapper[4775]: I0127 11:20:44.738728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738773 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738796 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738843 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738889 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738911 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738933 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738955 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738980 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739049 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739081 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739132 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739155 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739202 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739225 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739273 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739300 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739348 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739399 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739421 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739468 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739541 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739662 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739685 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739708 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739733 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758545 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.737874 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738069 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738237 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.738885 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.739883 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.239855338 +0000 UTC m=+24.381453205 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.760298 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.762615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.762682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.739935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740223 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740304 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740371 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.740876 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741039 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741072 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.741232 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.743130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.743980 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.744289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.744583 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749307 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749320 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749580 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.749612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750152 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.750705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.751007 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752059 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752529 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.752990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.754812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755017 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755614 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755789 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.755950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756369 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756547 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756905 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.756925 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757117 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757138 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757316 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757601 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.757875 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758088 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758158 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758174 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758266 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758474 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758687 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.775133 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776541 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776779 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.776856 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777009 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777257 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777288 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777267 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777275 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777371 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777620 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777717 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.758748 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777880 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777932 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.777981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778043 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778194 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778317 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778525 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778764 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.778985 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779296 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779914 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780042 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.779879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780186 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780398 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780442 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780487 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780513 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780598 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780680 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780705 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780836 4775 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780853 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780868 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780882 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780894 4775 reconciler_common.go:293] 
"Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780512 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.780675 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.780997 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.28097637 +0000 UTC m=+24.422574217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780907 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.780752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781071 4775 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.781871 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.781933 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.281922086 +0000 UTC m=+24.423519873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781977 4775 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.781996 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782010 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782026 4775 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782040 4775 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782054 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782067 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782080 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782094 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782108 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782120 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782135 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782148 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782207 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782227 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782244 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782258 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782271 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782286 4775 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782301 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782316 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782329 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782343 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 
11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782357 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782371 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782386 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782399 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782412 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782439 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782474 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782487 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782501 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782514 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782527 4775 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782556 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782569 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782587 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782601 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782615 4775 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782628 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782641 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782653 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782664 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782676 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782688 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782700 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc 
kubenswrapper[4775]: I0127 11:20:44.782714 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782729 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782742 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782756 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782769 4775 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782780 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782792 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782804 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782816 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782827 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782923 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782943 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782945 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). 
InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782956 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782982 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.782997 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783009 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783021 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783033 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783045 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783056 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783067 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783080 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783092 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783135 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783148 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783160 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783172 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783182 4775 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783193 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783202 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783212 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783222 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783232 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783242 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783360 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on 
node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783370 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783380 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783391 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783402 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783412 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783427 4775 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783436 4775 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783461 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783470 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783479 4775 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783488 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783496 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: 
"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.783073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.784088 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.784191 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.785090 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.785463 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786300 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786646 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.786839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787123 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787139 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.787677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.788211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.788770 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789508 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789805 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.789810 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790149 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790859 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790877 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.790904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.791962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792000 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792339 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792712 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.792832 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793192 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793414 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793520 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793547 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793562 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.793616 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.293598399 +0000 UTC m=+24.435196266 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793736 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.793851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794262 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.794793 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795655 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795699 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.795980 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796067 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796211 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796234 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796247 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: E0127 11:20:44.796295 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:45.2962764 +0000 UTC m=+24.437874177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.796962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798224 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798950 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.798963 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.799739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800046 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800147 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800495 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800243 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800855 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800880 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.800897 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801617 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.801972 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802008 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802372 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.802804 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.803436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.805479 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.806036 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.806244 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.807535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.808390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.809969 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.817812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.820769 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.824073 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.830989 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.841138 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.848331 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884397 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.884845 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod 
\"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885008 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885034 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885069 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885083 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885094 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885105 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0c38486b-7aef-4d58-8637-207994a976d9-hosts-file\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885115 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885148 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885163 4775 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885173 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885186 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885196 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885229 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885243 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885254 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885267 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885277 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885311 4775 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885324 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885335 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885355 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885370 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885379 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885388 4775 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885397 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885407 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885415 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885423 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885441 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885463 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885471 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885479 4775 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885487 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885495 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885502 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885510 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885519 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885526 4775 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885534 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885542 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885549 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885558 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885566 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885575 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885582 4775 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885590 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885604 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885613 4775 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885622 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885630 4775 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885639 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885648 4775 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885656 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885665 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885675 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885683 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885691 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885698 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885707 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885714 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885722 4775 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885731 4775 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885740 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885753 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885761 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885769 4775 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885778 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885786 4775 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885794 4775 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885802 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885810 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885819 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885827 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885835 4775 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885844 4775 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885853 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885860 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885868 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885876 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885884 4775 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885892 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885899 4775 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885907 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885915 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885923 4775 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885930 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885938 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885946 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885954 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885963 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885974 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885983 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885991 4775 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.885999 4775 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.902779 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2wmf\" (UniqueName: \"kubernetes.io/projected/0c38486b-7aef-4d58-8637-207994a976d9-kube-api-access-c2wmf\") pod \"node-resolver-vxn5f\" (UID: \"0c38486b-7aef-4d58-8637-207994a976d9\") " pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.979865 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 11:20:44 crc kubenswrapper[4775]: I0127 11:20:44.988103 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 11:20:44 crc kubenswrapper[4775]: W0127 11:20:44.997963 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969 WatchSource:0}: Error finding container 024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969: Status 404 returned error can't find the container with id 024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969 Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.000609 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-vxn5f" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.008754 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 11:20:45 crc kubenswrapper[4775]: W0127 11:20:45.015020 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c38486b_7aef_4d58_8637_207994a976d9.slice/crio-74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e WatchSource:0}: Error finding container 74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e: Status 404 returned error can't find the container with id 74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.267279 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 11:15:44 +0000 UTC, rotation deadline is 2026-11-19 17:32:48.613122666 +0000 UTC Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.267640 4775 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7110h12m3.345485877s for next certificate rotation Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.289907 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.290018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290053 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.29002917 +0000 UTC m=+25.431626947 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290085 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.290114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290144 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.290126522 +0000 UTC m=+25.431724379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290207 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.290246 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.290239845 +0000 UTC m=+25.431837622 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.390529 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.390580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390676 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390689 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390699 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390709 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390733 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390745 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390745 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.39073179 +0000 UTC m=+25.532329567 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: E0127 11:20:45.390792 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:46.390780041 +0000 UTC m=+25.532377828 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.693314 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 14:29:02.781739905 +0000 UTC Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.753952 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.754680 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.755732 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.756304 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.757260 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.757750 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.758328 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.759243 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.759831 4775 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.760715 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.761305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.762305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.762777 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.763257 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.764105 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.764618 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.765567 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.766008 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.766557 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.767568 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.767991 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.768899 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.769330 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.770284 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.770767 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.771405 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.772663 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.773171 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.774339 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.774986 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.775932 4775 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.776040 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.778019 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.779248 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.779737 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.781550 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.782218 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.783080 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.783779 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.784985 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.785539 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.786906 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.787652 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.788850 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.789443 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.790561 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.791225 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.792820 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.793475 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.794321 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.794884 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.795845 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.796660 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.797211 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.847891 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-gm7w4"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848166 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qn99x"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848352 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848410 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.848619 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dcnmf"] Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.849669 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852321 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852577 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852764 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852795 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852855 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.852917 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.853990 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854065 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854078 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854089 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.854364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.855064 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.879812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.881040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"9f343f88d507d23485006b27c70c1d80eaeefbc7c76be208875b5f630ef916a1"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882686 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.882718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"024f8b40b424e2cec8aa68c61b3ab47a9cd6ab8d23347a81db63b205cdd93969"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.883837 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.883877 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d8bd653fa4bf5063bd06102c9ce039294f6da0d28d294372f3deebdaf672147a"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.885834 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.886226 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.887979 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" exitCode=255 Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.888051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.888129 4775 scope.go:117] "RemoveContainer" containerID="67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.889937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxn5f" event={"ID":"0c38486b-7aef-4d58-8637-207994a976d9","Type":"ContainerStarted","Data":"7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.889984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-vxn5f" event={"ID":"0c38486b-7aef-4d58-8637-207994a976d9","Type":"ContainerStarted","Data":"74ab5ea2cf0a4be54ed63669cfd552952379470c18c5453408e37e3be8225f4e"} Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893908 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: 
\"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.893978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894049 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894078 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj6q4\" (UniqueName: \"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894196 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894300 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894643 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.894949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895012 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895034 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895129 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.895203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.898386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.910757 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.920970 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.934051 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.949370 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.963294 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995161 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:45Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc 
kubenswrapper[4775]: I0127 11:20:45.995684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995707 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995816 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995835 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-multus-certs\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995857 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj6q4\" (UniqueName: 
\"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995900 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-bin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995917 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-hostroot\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995924 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cnibin\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995977 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-cnibin\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.995999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-netns\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc 
kubenswrapper[4775]: I0127 11:20:45.995979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-socket-dir-parent\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996074 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996096 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996106 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-system-cni-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996120 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996150 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-run-k8s-cni-cncf-io\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996166 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996181 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-system-cni-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-multus-conf-dir\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996230 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-cni-multus\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996238 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-etc-kubernetes\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996291 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996321 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996344 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996366 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7707cf23-0a23-4f57-8184-f7a4f7587aa2-rootfs\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996510 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-os-release\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997294 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7707cf23-0a23-4f57-8184-f7a4f7587aa2-mcd-auth-proxy-config\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997496 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-multus-daemon-config\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997500 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-cni-binary-copy\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.997515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-os-release\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.996644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aba2edc6-0e64-4995-830d-e177919ea13e-host-var-lib-kubelet\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:45 crc kubenswrapper[4775]: I0127 11:20:45.999129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aba2edc6-0e64-4995-830d-e177919ea13e-cni-binary-copy\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.006294 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7707cf23-0a23-4f57-8184-f7a4f7587aa2-proxy-tls\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.020056 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.020410 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.020710 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021145 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021877 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tdkh\" (UniqueName: \"kubernetes.io/projected/7707cf23-0a23-4f57-8184-f7a4f7587aa2-kube-api-access-2tdkh\") pod \"machine-config-daemon-qn99x\" (UID: \"7707cf23-0a23-4f57-8184-f7a4f7587aa2\") " pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.021951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj6q4\" (UniqueName: \"kubernetes.io/projected/404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d-kube-api-access-mj6q4\") pod \"multus-additional-cni-plugins-dcnmf\" (UID: \"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\") " pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.039637 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj4jn\" (UniqueName: \"kubernetes.io/projected/aba2edc6-0e64-4995-830d-e177919ea13e-kube-api-access-pj4jn\") pod \"multus-gm7w4\" (UID: \"aba2edc6-0e64-4995-830d-e177919ea13e\") " pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.061998 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.092649 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.113686 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.135869 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.147718 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.158774 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.161385 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gm7w4" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.170023 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.172414 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.176582 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" Jan 27 11:20:46 crc kubenswrapper[4775]: W0127 11:20:46.181635 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7707cf23_0a23_4f57_8184_f7a4f7587aa2.slice/crio-412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455 WatchSource:0}: Error finding container 412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455: Status 404 returned error can't find the container with id 412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.186292 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: W0127 11:20:46.187592 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod404c5bcc_dd1d_479b_8ce2_2b9fd6f2db9d.slice/crio-3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7 WatchSource:0}: Error finding container 3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7: Status 404 returned error can't find the container with id 3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.201948 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.214036 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.214820 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.217967 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218082 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.217972 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218157 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218107 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.218255 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.221811 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.232215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.245158 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.264900 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.275828 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.292192 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.298245 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.298217352 +0000 UTC m=+27.439815119 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298718 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298916 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.298964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299000 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299041 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299067 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299068 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299147 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.299138626 +0000 UTC m=+27.440736403 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299343 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: 
\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299361 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299385 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299424 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.299464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.299547 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.299535497 +0000 UTC m=+27.441133364 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.309643 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.329601 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.347047 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.362201 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.373420 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.387345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400466 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400511 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400563 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400626 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400645 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc 
kubenswrapper[4775]: I0127 11:20:46.400784 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400838 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400872 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400886 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.400935 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.400917825 +0000 UTC m=+27.542515802 (durationBeforeRetry 2s). 
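[Annotation] The projected.go entries above show why kube-api-access-* volumes fail as a unit: a projected volume draws on several sources (here the kube-root-ca.crt and openshift-service-ca.crt ConfigMaps alongside the service-account token), and preparation reports the combined per-source error list, as the "Error:" line below repeats verbatim. A hedged sketch of that aggregation using errors.Join follows; prepareProjectedVolume and getConfigMap are illustrative stand-ins, not kubelet functions.

package main

import (
	"errors"
	"fmt"
)

// getConfigMap stands in for the registry lookup that fails during startup.
func getConfigMap(ns, name string) error {
	return fmt.Errorf("object %q/%q not registered", ns, name)
}

// prepareProjectedVolume collects every per-source failure and returns
// them joined, so the caller logs one error listing all missing objects.
func prepareProjectedVolume(ns string, sources []string) error {
	var errs []error
	for _, cm := range sources {
		if err := getConfigMap(ns, cm); err != nil {
			errs = append(errs, err)
		}
	}
	return errors.Join(errs...) // nil only when every source resolved
}

func main() {
	err := prepareProjectedVolume("openshift-network-diagnostics",
		[]string{"kube-root-ca.crt", "openshift-service-ca.crt"})
	fmt.Println("Error preparing data for projected volume kube-api-access-cqllr:", err)
}

The interrupted journal entry, with the joined error list, continues below.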
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401310 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401350 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401402 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.400811 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401871 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:46 crc kubenswrapper[4775]: 
I0127 11:20:46.401889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401894 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401922 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402031 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.401961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402155 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402178 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402191 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.402239 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:48.40222316 +0000 UTC m=+27.543821127 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.402556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.405840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.410148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.414107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.418723 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 11:20:46 
crc kubenswrapper[4775]: I0127 11:20:46.423908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"ovnkube-node-nzthg\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.427523 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.427796 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.439628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.453725 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 
11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.467318 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.482650 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.496306 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.513620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.532486 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.533743 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.557172 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.578754 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.595514 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.610746 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.628814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 
1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.646063 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.660976 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.673201 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.685541 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.694523 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 16:17:34.211602619 +0000 UTC Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.698221 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.710136 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.721434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.735338 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744349 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744412 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.744369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744504 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744663 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.744866 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.749613 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.765685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.791175 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\
\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.817187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.894120 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 
11:20:46.894185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.894201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"412506a9b82a5cd070407ab85f098969d63083ac651b0be6a3b9fb4107f70455"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.895272 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.895296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"88c6cc63acfc378ddb4f98b32e64b6cc2284716135203b6082128b8b97604592"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897178 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c" exitCode=0 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.897270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"3c7803a6a3513b177a0362fb2939558506b8a446c20d328fb207dcfb42eb1ee7"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.899612 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.902528 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:46 crc kubenswrapper[4775]: E0127 11:20:46.902791 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903434 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5" exitCode=0 Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903481 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" 
event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.903527 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229"} Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.910745 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@s
ha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.928847 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.945057 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.954903 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.978888 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:46 crc kubenswrapper[4775]: I0127 11:20:46.993899 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:46Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.006987 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.021337 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabl
ed\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.039296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.053748 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67a870569d400ca1948934b792b55f3145d18677c2ac71aa327602e4e18e182f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:28Z\\\",\\\"message\\\":\\\"W0127 11:20:27.916104 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0127 11:20:27.916390 1 crypto.go:601] Generating new CA for check-endpoints-signer@1769512827 cert, and key in /tmp/serving-cert-3044175487/serving-signer.crt, /tmp/serving-cert-3044175487/serving-signer.key\\\\nI0127 11:20:28.348355 1 observer_polling.go:159] Starting file observer\\\\nW0127 11:20:28.350725 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 11:20:28.350887 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:28.352887 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3044175487/tls.crt::/tmp/serving-cert-3044175487/tls.key\\\\\\\"\\\\nF0127 11:20:28.690147 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for 
mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" 
Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.071622 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.083106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.096016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.126207 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.129676 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.183906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.220004 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.259537 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.299633 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc 
kubenswrapper[4775]: I0127 11:20:47.341025 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runnin
g\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.379144 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.418536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.459113 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.497137 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\
":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.539353 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.578385 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.616682 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.694858 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 05:37:20.085252434 +0000 UTC Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.909977 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911063 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" exitCode=1 Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911208 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.911225 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.912967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.914025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf"} Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 
11:20:47.925264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.936250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.949652 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\
\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\
\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.965472 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.976836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.987749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:47 crc kubenswrapper[4775]: I0127 11:20:47.998715 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:47Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.008786 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.022617 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.036228 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.059936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.099185 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.141583 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.182804 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.219758 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.259436 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.309046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320162 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320218 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.320187536 +0000 UTC m=+31.461785333 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.320275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320283 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320354 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.32033413 +0000 UTC m=+31.461931967 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320428 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.320503 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.320492084 +0000 UTC m=+31.462089941 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.345667 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.378816 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha2
56:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.421013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.421064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421205 4775 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421223 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421235 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421235 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421257 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421268 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421289 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.421271786 +0000 UTC m=+31.562869563 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.421319 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:52.421299537 +0000 UTC m=+31.562897324 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.423107 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.430110 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-9dz9r"] Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.430467 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.450145 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.470386 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.490936 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.510749 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.521677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.538079 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.578931 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.617814 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.621948 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.621982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.622051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.622073 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c1ce49b6-6832-4f61-bad3-63174f36eba9-host\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.623112 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c1ce49b6-6832-4f61-bad3-63174f36eba9-serviceca\") pod \"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.669033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhgjw\" (UniqueName: \"kubernetes.io/projected/c1ce49b6-6832-4f61-bad3-63174f36eba9-kube-api-access-hhgjw\") pod 
\"node-ca-9dz9r\" (UID: \"c1ce49b6-6832-4f61-bad3-63174f36eba9\") " pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.679873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.696019 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 22:51:59.257235401 +0000 UTC Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.718966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744427 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744464 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744531 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.744548 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744627 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:48 crc kubenswrapper[4775]: E0127 11:20:48.744678 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.757195 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.798462 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.837859 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.866315 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9dz9r" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.892105 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.919036 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2
7753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.920099 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936" exitCode=0 Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.920169 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936"} Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.921358 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dz9r" event={"ID":"c1ce49b6-6832-4f61-bad3-63174f36eba9","Type":"ContainerStarted","Data":"f56e3b2430b7fad2b35329c5a732c98eed1bee3e7a01738e269bbfc1ebe4672d"} Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.958595 4775 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:48 crc kubenswrapper[4775]: I0127 11:20:48.999093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:48Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.037327 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.079785 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.118898 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.158308 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.199739 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.245073 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.276605 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.320306 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.358966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.397983 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.440685 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.479904 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-
27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.522054 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.557072 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.599361 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.639016 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.676453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.697033 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 21:24:10.303393206 +0000 UTC Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.721498 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.759146 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.798360 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.838641 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.877234 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.926691 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0" exitCode=0 Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.926767 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0"} Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.928396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9dz9r" event={"ID":"c1ce49b6-6832-4f61-bad3-63174f36eba9","Type":"ContainerStarted","Data":"280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d"} Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.939316 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.961102 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:49 crc kubenswrapper[4775]: I0127 11:20:49.999426 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:49Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.040838 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.081042 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\
":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary
-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.127038 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"}
,{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.159813 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c068
36454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.205842 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.241576 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.283261 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.323426 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.362314 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.403263 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.443496 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.485474 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.519736 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.563357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.597251 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.642964 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.680542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.697947 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 21:58:00.141605215 +0000 UTC
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.725902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.744981 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.745079 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745125 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.745089 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745390 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:20:50 crc kubenswrapper[4775]: E0127 11:20:50.745555 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.761259 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.804179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.840453 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z"
Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.880101 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.922902 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.934185 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c" exitCode=0 Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.934268 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c"} Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.937576 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.939131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072"} Jan 27 11:20:50 crc kubenswrapper[4775]: I0127 11:20:50.959164 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.000075 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:50Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.039917 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.078185 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.119643 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.156955 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.199966 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.229461 4775 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.233996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.234541 4775 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.239558 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.291431 4775 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.291759 4775 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292768 4775 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.292839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.304049 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 
2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.308409 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.320793 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 
2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.320974 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.323970 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.337525 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 
2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340959 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.340993 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.353988 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 
2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.357525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.361320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.369153 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 
2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: E0127 11:20:51.369405 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371200 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371232 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.371255 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.405066 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.437874 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.467364 4775 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481719 4775 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.481743 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.501606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-res
ources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.521304 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.559993 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.584788 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.598771 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.687932 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.698743 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 17:26:23.976860249 +0000 UTC Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.754275 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.764161 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.776521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.785300 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.790187 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.796729 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.842119 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.885833 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.892749 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.922799 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:
20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.943320 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e" exitCode=0 Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.943354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.967412 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.994779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.995374 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:51Z","lastTransitionTime":"2026-01-27T11:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:51 crc kubenswrapper[4775]: I0127 11:20:51.997695 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.039672 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.077214 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.098162 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.118507 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.159538 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.213984 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217770 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217815 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.217863 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.243982 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.279288 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.318908 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320550 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.320610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.356951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.357109 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.357186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357267 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357345 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357323132 +0000 UTC m=+39.498920919 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357273 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357427 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357359013 +0000 UTC m=+39.498956890 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.357473 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.357443135 +0000 UTC m=+39.499040992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.359332 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.400046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.422978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.444283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.457583 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.457640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457772 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457961 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457970 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457981 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457985 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.457996 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.458049 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.458023282 +0000 UTC m=+39.599621059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.458065 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.458058913 +0000 UTC m=+39.599656690 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.476843 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.519793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.525560 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.559497 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.605890 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.627919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.627966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.627978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.628001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.628014 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.638875 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.687887 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z 
is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.699344 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 20:14:53.691746542 +0000 UTC Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.719015 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730481 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc 
kubenswrapper[4775]: I0127 11:20:52.730533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.730541 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.743801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.743881 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.744102 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744120 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:52 crc kubenswrapper[4775]: E0127 11:20:52.744223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833721 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833729 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.833758 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.936883 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:52Z","lastTransitionTime":"2026-01-27T11:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.950589 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951362 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951564 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951596 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.951711 4775 scope.go:117] "RemoveContainer" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.956529 4775 generic.go:334] "Generic (PLEG): container finished" podID="404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d" containerID="b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda" exitCode=0 Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.956574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerDied","Data":"b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda"} Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.968071 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.983823 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.985815 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.986912 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:20:52 crc kubenswrapper[4775]: I0127 11:20:52.996394 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.010030 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.024849 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.039790 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.041014 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.058698 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.079655 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\
\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:4
6Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.091196 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.118628 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142148 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.142159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.162983 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.197569 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.237356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244535 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.244634 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.282639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.319877 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.346814 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.359113 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.398110 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.403386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.404605 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34" Jan 27 11:20:53 crc kubenswrapper[4775]: E0127 11:20:53.404880 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.438572 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449591 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.449619 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.481007 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.519091 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.553789 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.567759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.604986 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\
\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.645712 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 
27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.657402 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.681168 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.700039 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 10:51:39.888059413 +0000 UTC Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.722881 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.759264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.765342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.800106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.838093 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861725 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.861738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.964416 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" event={"ID":"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d","Type":"ContainerStarted","Data":"09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965454 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.965509 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:53Z","lastTransitionTime":"2026-01-27T11:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.971367 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.972468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693"} Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.972556 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:53 crc kubenswrapper[4775]: I0127 11:20:53.984872 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:53Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.002740 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.015481 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.026936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.040796 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.068218 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.078302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.120035 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.157046 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171357 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.171417 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.198941 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.240355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274328 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.274412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.283647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.322285 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.366253 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\
"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ 
nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",
\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.376837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.377104 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.397267 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.439255 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.475722 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479188 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.479247 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.524260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.562618 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.582357 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.602899 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.642861 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685542 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685553 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.685587 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.700588 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:07:34.85492499 +0000 UTC Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.720437 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744898 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745058 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745148 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.744903 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:54 crc kubenswrapper[4775]: E0127 11:20:54.745338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.758342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\
\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.793271 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.808603 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.861505 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursi
veReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.880133 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896748 
4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.896827 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.958188 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.974109 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:54Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.975516 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:54 crc kubenswrapper[4775]: I0127 11:20:54.998803 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:54Z","lastTransitionTime":"2026-01-27T11:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.101808 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.204691 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.306861 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.409846 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.512330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.614898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.701388 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 11:32:35.229483247 +0000 UTC Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717894 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.717987 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.820493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923464 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.923517 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:55Z","lastTransitionTime":"2026-01-27T11:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.982903 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.985871 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.987209 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" exitCode=1 Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.987275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba"} Jan 27 11:20:55 crc kubenswrapper[4775]: I0127 11:20:55.988386 4775 scope.go:117] "RemoveContainer" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.010655 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.030969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031423 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.031511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.032419 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.032620 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.052031 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.067928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.090039 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.113111 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.129620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.135979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.136099 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.146068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.162885 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.182566 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.203525 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.222250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.239246 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.247090 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 
reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn
/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.261053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:56Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.342584 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.444772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.547413 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.649921 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.702219 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 10:05:49.110974385 +0000 UTC Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744802 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744890 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.744954 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.745085 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.744836 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:56 crc kubenswrapper[4775]: E0127 11:20:56.745223 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.753289 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.856110 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958453 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.958497 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:56Z","lastTransitionTime":"2026-01-27T11:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.992616 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.994863 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.995372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79"} Jan 27 11:20:56 crc kubenswrapper[4775]: I0127 11:20:56.995495 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.014725 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.027942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.042948 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.055759 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060592 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.060663 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.068563 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.081791 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.092143 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.104320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.115104 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.127355 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.138854 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.151087 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.163134 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.168104 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\
\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.177864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.266821 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.369796 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472244 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.472258 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.574911 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677136 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677467 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.677524 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.703301 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 04:39:47.626925617 +0000 UTC Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.780814 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.883577 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.952661 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5"] Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.953239 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.955210 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.958133 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.978572 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.985888 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:57Z","lastTransitionTime":"2026-01-27T11:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:57 crc kubenswrapper[4775]: I0127 11:20:57.993586 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.000570 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.001962 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/0.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.004693 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007768 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" exitCode=1 Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.007848 4775 scope.go:117] "RemoveContainer" containerID="0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.008535 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.008670 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013601 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.013782 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.017801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.040416 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.052812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.068629 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.086919 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname 
/var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 
9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088499 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.088518 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.098495 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.111634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.114628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.115410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-env-overrides\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.115775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.120064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/722c4ef1-b8ec-4732-908b-4c697d7eef60-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.125599 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.134898 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjhfr\" (UniqueName: \"kubernetes.io/projected/722c4ef1-b8ec-4732-908b-4c697d7eef60-kube-api-access-mjhfr\") pod \"ovnkube-control-plane-749d76644c-7jxr5\" (UID: \"722c4ef1-b8ec-4732-908b-4c697d7eef60\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.138443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.150521 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.162877 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.173397 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.185026 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.190996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.191122 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.205427 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.234017 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 
11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.246122 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.262688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.277052 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.277104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293139 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"hos
t-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.293832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: W0127 11:20:58.305481 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod722c4ef1_b8ec_4732_908b_4c697d7eef60.slice/crio-b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af WatchSource:0}: Error finding container b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af: Status 404 returned error can't find the container with id b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.306592 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.323866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-2
7T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' 
detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.336059 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.349705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.399296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.405443 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.419008 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.428893 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.442866 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.453086 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:58Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.508674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.611318 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.704390 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:43:11.587026499 +0000 UTC Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713664 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.713679 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.744953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745274 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.744971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745395 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.745664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:20:58 crc kubenswrapper[4775]: E0127 11:20:58.745871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.816203 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918837 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.918981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:58 crc kubenswrapper[4775]: I0127 11:20:58.919004 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:58Z","lastTransitionTime":"2026-01-27T11:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015133 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.015209 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" event={"ID":"722c4ef1-b8ec-4732-908b-4c697d7eef60","Type":"ContainerStarted","Data":"b1def4c01e4a0ac6a8957def6b7649bc32727987ca64717d4772b4fdd26da4af"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.018420 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021724 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.021760 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.027925 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.034208 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.054268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\"
:\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0
c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running
\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.085496 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ 
controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.093780 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.094234 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.094304 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.100225 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.114515 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.123748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.123862 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124952 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.124969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: 
I0127 11:20:59.124993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.125008 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.129053 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.140049 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.152477 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.167883 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.182653 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.196310 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.207965 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.221187 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.224574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.224612 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.224741 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.224805 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:20:59.724788057 +0000 UTC m=+38.866385844 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.226960 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.232595 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.240977 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.246720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5frt\" (UniqueName: \"kubernetes.io/projected/c945c8b1-655c-4522-b703-0c5b9b8fcf38-kube-api-access-m5frt\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.255679 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.269106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.284573 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.297536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.314370 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329678 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.329717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.349697 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.359860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.373283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.384873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.395794 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.406687 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.419659 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.431586 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.436011 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.449361 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 
11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.466024 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.478138 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:59Z is after 2025-08-24T17:21:41Z" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534216 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534263 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534295 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.534307 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637329 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637449 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.637494 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.704767 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 02:28:59.817281931 +0000 UTC Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.729771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.729919 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: E0127 11:20:59.729993 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:00.729974002 +0000 UTC m=+39.871571809 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.740581 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.842626 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.945991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.946022 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:20:59 crc kubenswrapper[4775]: I0127 11:20:59.946045 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:20:59Z","lastTransitionTime":"2026-01-27T11:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048654 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.048800 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151065 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.151183 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.253864 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.356989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.357141 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.436928 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.437065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437198 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 11:21:16.437159884 +0000 UTC m=+55.578757701 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437221 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437291 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.437269157 +0000 UTC m=+55.578867034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.437284 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437368 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.437426 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.43741329 +0000 UTC m=+55.579011097 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.460922 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.538240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.538339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538509 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538550 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538567 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538570 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538597 4775 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538617 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538634 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.538614604 +0000 UTC m=+55.680212391 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.538684 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:16.538662295 +0000 UTC m=+55.680260102 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.563975 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.564109 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.668847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.669407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.705633 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 23:37:52.188041059 +0000 UTC Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.740949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.741125 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.741200 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:02.741177355 +0000 UTC m=+41.882775172 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.744845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.744954 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745047 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.745078 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.745161 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745166 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745242 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:00 crc kubenswrapper[4775]: E0127 11:21:00.745379 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.772932 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.876994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.877154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979719 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:00 crc kubenswrapper[4775]: I0127 11:21:00.979841 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:00Z","lastTransitionTime":"2026-01-27T11:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082758 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.082898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.185994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.186154 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288895 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288906 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.288940 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392113 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392125 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.392155 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.495333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.597991 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.598001 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640868 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.640910 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.662237 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.667309 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.687312 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
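
The repeated E0127 records above all fail for the same reason: the node-status patch is rejected because the API server cannot complete a TLS handshake with the node.network-node-identity.openshift.io webhook at https://127.0.0.1:9743, whose serving certificate expired on 2025-08-24T17:21:41Z, well before the node's current clock (2026-01-27). A minimal diagnostic sketch for confirming the expiry from the node, assuming Python plus the third-party cryptography package are available there; the host and port come from the Post URL in the log, everything else is illustrative:

    import ssl
    from datetime import datetime, timezone
    from cryptography import x509  # third-party "cryptography" package, assumed installed

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the Post URL above

    # Fetch the PEM certificate without verifying it; verification is exactly
    # what fails in the log, so it is skipped here to inspect the cert itself.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)  # naive UTC in older releases
    now = datetime.now(timezone.utc)
    print("subject:  ", cert.subject.rfc4514_string())
    print("notAfter: ", not_after.isoformat())
    print("expired:  ", now > not_after)  # True would match the x509 error above

Because verification is disabled, the handshake in the sketch succeeds even against the expired certificate, so the script can read the very certificate that the kubelet's patch path cannot accept.
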
event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.691546 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.706445 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 19:08:41.491120866 +0000 UTC Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.710921 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
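
Separately from the webhook failure, the NodeNotReady condition that these records keep re-recording has its own stated cause: there is no CNI configuration file in /etc/kubernetes/cni/net.d/, which the cluster's network plugin is expected to write once it is up (on OpenShift this is normally OVN-Kubernetes). A small check along the same lines, assuming it runs directly on the node; only the directory path is taken from the log, and the suffix list is an assumption about conventional CNI config names:

    from pathlib import Path

    # Directory named in the kubelet NotReady message. The suffixes are the
    # conventional CNI config extensions, an assumption rather than log content.
    CNI_DIR = Path("/etc/kubernetes/cni/net.d")
    SUFFIXES = {".conf", ".conflist", ".json"}

    confs = [p for p in sorted(CNI_DIR.iterdir()) if p.suffix in SUFFIXES] if CNI_DIR.is_dir() else []
    if confs:
        for p in confs:
            print("CNI config present:", p)
    else:
        print(f"no CNI configuration in {CNI_DIR}; the network plugin has not written one yet")

Until a config file shows up there, the kubelet keeps reporting KubeletNotReady exactly as above, so both the expired webhook certificate and the missing CNI configuration have to be resolved before the node can go Ready.
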
event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.714611 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.732559 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736451 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.736474 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.752689 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: E0127 11:21:01.752838 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.754150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.759302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.773542 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.785701 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.800853 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.811608 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.823171 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.834699 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.853600 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\"
,\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855755 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.855779 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.866836 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.883338 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 
11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.895379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.908618 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.921929 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1
ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.934181 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.943814 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.953653 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:01Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958096 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958149 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:01 crc kubenswrapper[4775]: I0127 11:21:01.958159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:01Z","lastTransitionTime":"2026-01-27T11:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060129 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.060295 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162894 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.162982 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264840 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.264879 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.368264 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470274 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470708 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.470769 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.574980 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.678330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.707484 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 22:44:13.208037678 +0000 UTC Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744933 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745372 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744993 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745670 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.744948 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.745912 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.745017 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.746227 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.762990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.763151 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:02 crc kubenswrapper[4775]: E0127 11:21:02.763228 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:06.763205481 +0000 UTC m=+45.904803258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.780936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781121 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.781416 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884162 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.884220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988130 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:02 crc kubenswrapper[4775]: I0127 11:21:02.988150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:02Z","lastTransitionTime":"2026-01-27T11:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.091542 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.194547 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297470 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.297480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400539 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.400559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.503539 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.606972 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.607117 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.708043 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-06 19:53:57.558113556 +0000 UTC
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.709856 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812727 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.812750 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:03 crc kubenswrapper[4775]: I0127 11:21:03.915907 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:03Z","lastTransitionTime":"2026-01-27T11:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.018882 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121905 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.121923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.224659 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.327670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.430966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.431056 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533394 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.533433 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636095 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.636136 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.708618 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:14:20.711132414 +0000 UTC
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738607 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.738668 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744128 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744146 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744470 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.744510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744558 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744746 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.744950 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:04 crc kubenswrapper[4775]: E0127 11:21:04.745024 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.746078 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841714 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.841778 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:04 crc kubenswrapper[4775]: I0127 11:21:04.944107 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:04Z","lastTransitionTime":"2026-01-27T11:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.045988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.046090 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.050098 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.051329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.051593 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.067405 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed301360764
6d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.078926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.088630 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.099464 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.108066 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.127320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.145010 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148635 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.148672 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.167070 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 
reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.179010 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.190654 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.205022 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.217357 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.228926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.239846 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250538 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250569 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250597 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.250608 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.253296 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.266009 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:05Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353418 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.353438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456120 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.456236 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559088 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.559098 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662124 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.662186 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.709345 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 19:57:16.405542846 +0000 UTC Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763961 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.763997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.764015 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.866999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867054 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.867082 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:05 crc kubenswrapper[4775]: I0127 11:21:05.969964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:05Z","lastTransitionTime":"2026-01-27T11:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072166 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072224 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072241 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.072308 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.174407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276308 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.276422 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.378983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.379043 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.480588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.582996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.583005 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.685664 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.710203 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 01:23:09.091822811 +0000 UTC Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744690 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744722 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744762 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.744708 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744815 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744883 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.744953 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.745010 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.788947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.788992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.789025 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.806937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.807163 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 27 11:21:06 crc kubenswrapper[4775]: E0127 11:21:06.807267 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:14.807239912 +0000 UTC m=+53.948837729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891408 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891468 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891482 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.891490 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
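The nestedpendingoperations record above shows the kubelet backing off before retrying the failed metrics-certs mount (durationBeforeRetry 8s, next attempt at m=+53.9s of kubelet uptime). A minimal sketch of that doubling backoff, assuming kubelet's commonly cited defaults of a 500 ms initial delay, a 2x growth factor, and a cap of roughly two minutes; the constants here are illustrative, not read from this log:

# Sketch of the doubling retry backoff suggested by "durationBeforeRetry 8s".
# The 500 ms initial delay, 2x factor, and ~2-minute cap are assumptions
# based on kubelet's commonly cited defaults, not values from this log.
from datetime import timedelta

initial = timedelta(milliseconds=500)  # assumed initial delay
factor = 2.0                           # assumed growth factor
cap = timedelta(minutes=2, seconds=2)  # assumed maximum delay

delay = initial
for attempt in range(1, 9):
    print(f"failure {attempt}: next retry in {delay.total_seconds():g}s")
    delay = min(timedelta(seconds=delay.total_seconds() * factor), cap)

Under those assumptions, the fifth consecutive mount failure yields the 8 s delay seen in the record; the secret itself cannot mount until "openshift-multus"/"metrics-daemon-secret" is registered with the kubelet.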
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.993631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.993954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:06 crc kubenswrapper[4775]: I0127 11:21:06.994561 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:06Z","lastTransitionTime":"2026-01-27T11:21:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.097925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.097992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098012 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.098021 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201640 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201709 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.201719 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.305698 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.409443 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.513766 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.616877 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.710796 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 12:16:38.206933187 +0000 UTC Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.719944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.720065 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.822753 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925802 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:07 crc kubenswrapper[4775]: I0127 11:21:07.925855 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:07Z","lastTransitionTime":"2026-01-27T11:21:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.028927 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132630 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.132738 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.235588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338734 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.338790 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.442261 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544774 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.544832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648236 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648389 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.648438 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.711862 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 12:50:05.020245952 +0000 UTC
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744617 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744691 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744762 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.744796 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.744653 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.744986 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.745077 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:08 crc kubenswrapper[4775]: E0127 11:21:08.745287 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
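Every "Error syncing pod, skipping" record above traces back to the same condition: nothing has yet written a CNI configuration into /etc/kubernetes/cni/net.d/, so no pod sandbox can be networked. A small sketch of the equivalent check an operator might run against that directory; the path is taken from the log, and the .conf/.conflist/.json suffixes are the conventional CNI config extensions:

# Minimal sketch of the check behind "no CNI configuration file in
# /etc/kubernetes/cni/net.d/": list candidate CNI config files and report
# whether any are present yet.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")
candidates = sorted(
    p for p in CNI_CONF_DIR.glob("*")
    if p.suffix in {".conf", ".conflist", ".json"}
) if CNI_CONF_DIR.is_dir() else []

if candidates:
    print("CNI config present:", ", ".join(p.name for p in candidates))
else:
    print(f"no CNI configuration file in {CNI_CONF_DIR}/ -- "
          "the network plugin has not written one yet")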
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751656 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.751792 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.854940 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855018 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.855060 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.957912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.957990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958029 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:08 crc kubenswrapper[4775]: I0127 11:21:08.958043 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:08Z","lastTransitionTime":"2026-01-27T11:21:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.061542 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.163947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.163998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164008 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.164032 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.267410 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.369998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.370087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.472923 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.473159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576287 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.576532 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.679997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680083 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.680126 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.712654 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 12:17:45.66238471 +0000 UTC Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.781999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782092 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.782103 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.885795 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:09 crc kubenswrapper[4775]: I0127 11:21:09.988488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:09Z","lastTransitionTime":"2026-01-27T11:21:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.090920 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.194208 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.296985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297055 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.297087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400282 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.400382 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.502941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503061 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.503122 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606313 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.606356 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708703 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.708760 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.713203 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 01:19:07.359072324 +0000 UTC Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744613 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744662 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744709 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.744678 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.744833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.744978 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.745060 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:10 crc kubenswrapper[4775]: E0127 11:21:10.745172 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811747 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.811818 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914426 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:10 crc kubenswrapper[4775]: I0127 11:21:10.914489 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:10Z","lastTransitionTime":"2026-01-27T11:21:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017616 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017663 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.017682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.120471 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222744 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.222767 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325577 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.325702 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.428974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429040 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.429097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.453944 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.466753 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.478773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92e
daf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.498942 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.521006 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.531079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.535109 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.547238 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.556614 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.567389 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.578852 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.600106 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.621873 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\"
,\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633786 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.633865 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.645032 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.672020 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 
11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.692647 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.713704 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.713666 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 13:20:52.385902479 +0000 UTC Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.727223 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.736709 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.741506 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.764465 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.783281 
4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:
45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.799167 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\
\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.828956 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ 
sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0caeedf1fbc0970bde0301e5a2177767e683a1ad236e1e61c7470a5fc3ba80ba\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:55Z\\\",\\\"message\\\":\\\"topping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.572809 6022 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573154 6022 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573340 6022 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0127 11:20:55.573355 6022 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0127 11:20:55.573760 6022 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.573785 6022 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0127 11:20:55.573884 6022 factory.go:656] Stopping watch factory\\\\nI0127 11:20:55.573893 6022 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 11:20:55.574045 6022 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 
per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839089 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.839208 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.843586 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847646 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847701 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.847852 4775 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.860947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.868361 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.872189 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.876668 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.886215 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889927 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.889965 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.892565 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.904147 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.911730 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056
b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951
},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916632 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.916685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.917688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.930198 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934759 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934866 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.934903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.936007 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.954803 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"5
74d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: E0127 11:21:11.954992 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.956939 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]
},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.957345 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:11Z","lastTransitionTime":"2026-01-27T11:21:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.968878 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.983609 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:11 crc kubenswrapper[4775]: I0127 11:21:11.996381 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:11Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.007588 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.024641 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060475 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.060488 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162799 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.162830 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264929 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264977 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.264990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.265007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.265019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.367194 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.471158 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.573960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574073 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.574097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676700 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.676764 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.714406 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 16:58:39.123852364 +0000 UTC Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.743995 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744129 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744158 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744213 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744022 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744582 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744681 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:12 crc kubenswrapper[4775]: E0127 11:21:12.744708 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.744867 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.763864 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 
2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.779812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.780023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.780257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.790676 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\
\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.814001 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.834287 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.849616 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.863268 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.881996 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883178 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883211 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.883251 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.905522 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.926279 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.940132 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.952706 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.974274 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986567 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.986629 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:12Z","lastTransitionTime":"2026-01-27T11:21:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:12 crc kubenswrapper[4775]: I0127 11:21:12.989023 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:12Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.005684 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799
488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.022285 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.033590 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.050527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.080725 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.083242 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.083889 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.084020 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.089361 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.108418 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.121037 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.145293 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.176677 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-l
ib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191146 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.191157 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.194527 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.206509 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.225424 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.240482 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.261793 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.278621 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293588 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.293616 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.303312 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.324547 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.337656 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.350696 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.367146 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.379636 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.391346 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395779 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.395817 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497971 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497989 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.497999 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.600228 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702533 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.702576 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.715195 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 16:32:51.731319433 +0000 UTC Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.806717 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.807670 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910326 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910415 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:13 crc kubenswrapper[4775]: I0127 11:21:13.910485 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:13Z","lastTransitionTime":"2026-01-27T11:21:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013173 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.013192 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.090239 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.091428 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/1.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.095191 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096217 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" exitCode=1 Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.096354 4775 scope.go:117] "RemoveContainer" containerID="437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.097831 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.098167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.114702 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.116325 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.134000 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.148147 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.167100 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.189624 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.208266 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219303 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.219353 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.227801 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.247390 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.274548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ 
sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://437cf073f81e633a016279788896ce5d1b712d5df05c3c26edafd28cba3edd79\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"message\\\":\\\"aler-operator per-node LB for network=default: []services.LB{}\\\\nI0127 11:20:57.255309 6185 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-nzthg\\\\nI0127 11:20:57.255316 6185 services_controller.go:453] Built service 
openshift-machine-api/cluster-autoscaler-operator template LB for network=default: []services.LB{}\\\\nF0127 11:20:57.255303 6185 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:20:57Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:20:57.255326 6185 services_controller.go:454] Service openshift-machine-api/cluster-autoscaler-operator for network=default has 2 cluster-wide, 0 per\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174
f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.291620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a93800
66b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.313683 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.322981 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323030 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.323055 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.335179 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.353648 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.368190 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.383806 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.399865 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.416787 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:14Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.426877 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.427064 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.529834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.530344 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633403 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.633414 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.715902 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 14:15:52.257288807 +0000 UTC Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736193 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.736212 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744366 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.744483 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.744501 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.744797 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.745037 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.745073 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.745377 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839420 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.839632 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.894888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.895697 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:14 crc kubenswrapper[4775]: E0127 11:21:14.895856 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:30.895809147 +0000 UTC m=+70.037406964 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942712 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942875 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:14 crc kubenswrapper[4775]: I0127 11:21:14.942898 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:14Z","lastTransitionTime":"2026-01-27T11:21:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.046373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.102419 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.106311 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.108571 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:15 crc kubenswrapper[4775]: E0127 11:21:15.108794 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.127386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.143219 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148798 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.148881 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.157239 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.173738 4775 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.186264 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.202744 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.219551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.230954 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.244778 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\"
,\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.251342 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.256744 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.272926 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.289947 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.305421 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.318173 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.330267 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.343870 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353540 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.353550 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.359302 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:15Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455930 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.455963 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558402 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.558568 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661161 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.661193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.716226 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 15:35:52.704164601 +0000 UTC Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764080 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764101 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.764113 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:15 crc kubenswrapper[4775]: I0127 11:21:15.900184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:15Z","lastTransitionTime":"2026-01-27T11:21:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003899 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003965 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.003999 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107804 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107820 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.107833 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.210963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211023 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.211082 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314332 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.314369 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416792 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416857 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.416868 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517040 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517149 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.517216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517443 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517609 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517568252 +0000 UTC m=+87.659166089 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517665 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517646894 +0000 UTC m=+87.659244761 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517790 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.517848 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.517833968 +0000 UTC m=+87.659431745 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519339 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.519389 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.618065 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.618241 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618412 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618416 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618541 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618566 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618508 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618652 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.618663252 +0000 UTC m=+87.760261069 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.618798 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:21:48.618779025 +0000 UTC m=+87.760376852 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621679 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621715 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.621771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.717102 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 07:30:55.763514847 +0000 UTC Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.724297 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744882 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744882 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745001 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.744924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745141 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745266 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:16 crc kubenswrapper[4775]: E0127 11:21:16.745410 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826915 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.826996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.827017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.827032 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929846 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:16 crc kubenswrapper[4775]: I0127 11:21:16.929869 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:16Z","lastTransitionTime":"2026-01-27T11:21:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.032421 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.134992 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.135001 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.238196 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.341193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444343 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.444401 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547563 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.547592 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.650673 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.717379 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 23:03:34.50270977 +0000 UTC Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752487 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752498 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.752507 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855217 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.855299 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:17 crc kubenswrapper[4775]: I0127 11:21:17.957733 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:17Z","lastTransitionTime":"2026-01-27T11:21:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.059998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.060010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.060020 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.162853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265374 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265684 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.265853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368769 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368787 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.368828 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.471571 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574773 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574811 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.574829 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678196 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678299 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678318 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678341 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.678359 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.718005 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:49:15.387990095 +0000 UTC Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744765 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744765 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.744891 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745302 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745418 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:18 crc kubenswrapper[4775]: E0127 11:21:18.745534 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.781978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.782001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.782018 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885164 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.885287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988685 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988761 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:18 crc kubenswrapper[4775]: I0127 11:21:18.988830 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:18Z","lastTransitionTime":"2026-01-27T11:21:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092306 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092324 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.092367 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.195473 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.297994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298038 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.298076 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401066 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.401102 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503662 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.503685 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607134 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607177 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.607193 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.710168 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.718653 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 10:26:13.43631159 +0000 UTC Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.813844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917397 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:19 crc kubenswrapper[4775]: I0127 11:21:19.917414 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:19Z","lastTransitionTime":"2026-01-27T11:21:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.020623 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124185 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.124198 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227826 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227922 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.227979 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331386 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.331568 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434800 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.434872 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538657 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.538670 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645521 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645613 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.645762 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.719378 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 02:35:39.948251203 +0000 UTC Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744745 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744930 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.744983 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.745226 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.745289 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.745536 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.746054 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:20 crc kubenswrapper[4775]: E0127 11:21:20.746334 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.749176 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852170 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852229 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.852229 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:20 crc kubenswrapper[4775]: I0127 11:21:20.955476 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:20Z","lastTransitionTime":"2026-01-27T11:21:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.058439 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161246 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
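From 11:21:21.767705 onward, every "Failed to update status for pod" entry below fails on one shared cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 serves a certificate whose notAfter is 2025-08-24T17:21:41Z, long past the node's current time of 2026-01-27. A sketch for confirming that from the node, assuming Python plus the third-party cryptography package are available; it deliberately skips verification, since verification is exactly what fails:

import ssl
from cryptography import x509  # third-party package; assumed installed

# Webhook endpoint taken from the Post URL in the errors below.
HOST, PORT = "127.0.0.1", 9743

# Fetch the serving certificate without verifying it, then print its
# validity window to compare against the x509 error in the log.
pem = ssl.get_server_certificate((HOST, PORT))
cert = x509.load_pem_x509_certificate(pem.encode())
print("notBefore:", cert.not_valid_before_utc)  # on older cryptography releases: not_valid_before
print("notAfter: ", cert.not_valid_after_utc)   # on older cryptography releases: not_valid_after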
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.161382 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264091 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.264150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.366924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.366995 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367020 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.367080 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470194 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
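The status_manager payloads in the failures below are JSON patches wrapped in one layer of Go string escaping, which makes them hard to read directly from the journal. A sketch that strips the escaping from a small constructed sample (the uid is taken from the kube-scheduler entry below; everything else in the sample is illustrative, and a capture that nests further quoting needs one extra replace per layer):

import json

# Constructed sample in the escaped style of the status_manager entries below.
raw = r'{\"metadata\":{\"uid\":\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\"},\"status\":{\"phase\":\"Running\"}}'

# Remove one layer of backslash-escaping, then parse and pretty-print.
patch = json.loads(raw.replace('\\"', '"'))
print(json.dumps(patch, indent=2))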
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.470274 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.573956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.574165 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677876 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677954 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.677978 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.678013 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.678057 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.719650 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:46:36.605535352 +0000 UTC Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.767705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781465 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781494 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.781505 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.798269 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.820359 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.838009 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.861374 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.882358 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.884493 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.901056 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.916284 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.930878 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.944347 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.959606 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.976320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.987963 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.987996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988026 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.988037 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:21Z","lastTransitionTime":"2026-01-27T11:21:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:21 crc kubenswrapper[4775]: I0127 11:21:21.994196 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:21Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.013812 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cn
ibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.042663 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"r
eadOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ 
sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has 
stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.055620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.068783 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091154 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.091172 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.113007 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.130751 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.144495 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.158379 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.177300 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.192160 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193388 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.193427 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.206162 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209389 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.209511 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.225791 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.235891 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241438 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241581 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.241633 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.248551 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ 
local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post 
\\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f
36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.260833 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.263780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266188 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.266239 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.281343 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.282208 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286944 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.286964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.303443 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.307588 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.311994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.312002 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.319950 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.328812 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"5
74d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.328954 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.330993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331023 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.331665 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.344260 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.358191 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.374548 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.384399 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:22Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433337 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.433348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.535938 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639179 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639197 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.639240 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.720711 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 15:05:43.439728348 +0000 UTC Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742572 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.742647 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.744934 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745076 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745084 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745251 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.745259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745375 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745507 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:22 crc kubenswrapper[4775]: E0127 11:21:22.745625 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846187 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846207 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846234 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.846253 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949325 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949366 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:22 crc kubenswrapper[4775]: I0127 11:21:22.949384 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:22Z","lastTransitionTime":"2026-01-27T11:21:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053583 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.053606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.156928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.157483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.157850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.158151 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.158731 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262271 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.262333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.365936 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468444 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.468495 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571619 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571636 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.571675 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.675161 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.721125 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 16:35:40.50168365 +0000 UTC Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777918 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.777978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880273 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.880306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982437 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:23 crc kubenswrapper[4775]: I0127 11:21:23.982492 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:23Z","lastTransitionTime":"2026-01-27T11:21:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084644 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.084665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187174 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187289 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.187306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.291884 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394504 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.394628 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.497391 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599775 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599841 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.599852 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702856 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.702875 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.721861 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 03:42:45.84616998 +0000 UTC Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.744442 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.744725 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745037 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.745165 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745711 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.745845 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.745953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:24 crc kubenswrapper[4775]: E0127 11:21:24.746083 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806087 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806119 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.806138 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909086 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909114 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:24 crc kubenswrapper[4775]: I0127 11:21:24.909127 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:24Z","lastTransitionTime":"2026-01-27T11:21:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012544 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.012566 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115152 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.115177 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217182 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.217262 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.319943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.319998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.320061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423210 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423255 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.423294 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526082 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526104 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.526113 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628298 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628309 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.628341 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.722246 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:30:41.484704045 +0000 UTC Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731219 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731245 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.731257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833585 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833610 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.833622 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:25 crc kubenswrapper[4775]: I0127 11:21:25.967480 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:25Z","lastTransitionTime":"2026-01-27T11:21:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070176 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.070865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.071005 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172853 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172861 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172874 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.172883 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275096 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275118 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.275135 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378190 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.378275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481297 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481368 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.481407 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.582968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.582996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583015 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.583024 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685574 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.685638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.723259 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:20:44.641596518 +0000 UTC Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.744801 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.744836 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.744961 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.745129 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.745156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.745765 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.746156 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:26 crc kubenswrapper[4775]: E0127 11:21:26.746399 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788471 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788512 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.788528 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892741 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.892759 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996379 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996559 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:26 crc kubenswrapper[4775]: I0127 11:21:26.996607 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:26Z","lastTransitionTime":"2026-01-27T11:21:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.098979 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.099087 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201649 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.201924 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304609 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304686 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304772 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.304853 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407836 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407878 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.407918 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510003 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510044 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.510077 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612248 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.612258 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713941 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.713990 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.714004 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.714014 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.724241 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 10:30:10.365808804 +0000 UTC Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816492 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816511 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.816553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.918980 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919045 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:27 crc kubenswrapper[4775]: I0127 11:21:27.919070 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:27Z","lastTransitionTime":"2026-01-27T11:21:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021005 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021074 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.021118 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126500 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126525 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.126578 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229555 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229611 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229627 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.229642 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331913 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331973 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.331986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.332021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.332034 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.434589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.434805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435206 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.435377 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.538148 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640401 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640603 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640633 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.640694 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.725291 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 06:11:26.865058265 +0000 UTC Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743853 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743907 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743940 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.743862 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744011 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744001 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.744079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744128 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744280 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:28 crc kubenswrapper[4775]: E0127 11:21:28.744405 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.846909 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949568 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:28 crc kubenswrapper[4775]: I0127 11:21:28.949623 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:28Z","lastTransitionTime":"2026-01-27T11:21:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052156 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.052167 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154948 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154966 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.154978 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257346 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.257373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.359998 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360058 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.360068 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462362 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462380 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.462394 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564598 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.564625 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667115 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.667250 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.725903 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 00:17:21.463243599 +0000 UTC Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769897 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769913 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.769923 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872736 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.872748 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:29 crc kubenswrapper[4775]: I0127 11:21:29.974896 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:29Z","lastTransitionTime":"2026-01-27T11:21:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077320 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077373 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077390 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.077404 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179425 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.179465 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281589 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.281617 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.384163 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486069 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.486078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588202 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588243 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.588265 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690859 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.690894 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.726364 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 02:02:28.127037361 +0000 UTC Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744394 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744436 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744436 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744513 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.744627 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744639 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744654 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.744708 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.793552 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896034 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896135 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896143 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.896167 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:30Z","lastTransitionTime":"2026-01-27T11:21:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:30 crc kubenswrapper[4775]: I0127 11:21:30.906545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.906709 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:30 crc kubenswrapper[4775]: E0127 11:21:30.906809 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:02.906778784 +0000 UTC m=+102.048376591 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016057 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016072 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.016081 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118280 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118317 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.118333 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220782 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.220913 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323186 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.323244 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425718 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425797 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.425844 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528516 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.528565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630919 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630933 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.630965 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.726835 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:24:59.347517571 +0000 UTC Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733175 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733225 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733276 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.733297 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.757824 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.767122 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.776315 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.786561 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.798500 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.810249 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.822717 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836440 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836509 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836533 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.836959 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSta
tuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:
20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.865342 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is 
after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.886114 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.905816 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.920590 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.931378 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938693 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938723 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.938737 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:31Z","lastTransitionTime":"2026-01-27T11:21:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.940860 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.952906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026
-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.964511 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:31 crc kubenswrapper[4775]: I0127 11:21:31.973299 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:31Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041518 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041528 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.041554 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143545 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.143610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} [log trimmed: the same five-message sequence (four kubelet_node_status.go:724 "Recording event message for node" entries for NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, and NodeNotReady, followed by the setters.go:603 "Node became not ready" entry) repeats verbatim at 11:21:32.245774, 11:21:32.348196, 11:21:32.450970, and 11:21:32.553801] Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606214 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606249 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606258 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.606282 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
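[editorial aside: the "Error updating node status" entry that follows embeds its JSON status patch as a quoted Go string, which is why every quote surfaces as \\\" in the journal. A minimal Go sketch for recovering readable JSON from such a fragment; the raw value below is a shortened, hypothetical stand-in for text copied out of the log, not the full payload:]

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// Shortened, hypothetical stand-in for the escaped patch fragment copied
	// out of the journal; the real entry is far longer but quoted the same way.
	raw := `"{\"status\":{\"conditions\":[{\"reason\":\"KubeletNotReady\",\"status\":\"False\",\"type\":\"Ready\"}]}}"`

	// strconv.Unquote strips one layer of Go-style quoting, turning \" back into ".
	// Deeper-nested fragments may need more than one pass.
	unquoted, err := strconv.Unquote(raw)
	if err != nil {
		panic(err)
	}

	// json.Indent pretty-prints the recovered patch for inspection.
	var out bytes.Buffer
	if err := json.Indent(&out, []byte(unquoted), "", "  "); err != nil {
		panic(err)
	}
	fmt.Println(out.String())
}
```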
Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.621630 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.629982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630024 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.630084 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.651275 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655546 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655606 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.655642 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.677202 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680730 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.680742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} [log trimmed: the same "Error updating node status, will retry" entry, with the identical patch payload, repeats verbatim at 11:21:32.692042 and is again rejected by the "node.network-node-identity.openshift.io" webhook with the same expired-certificate error.] Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695383 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695443 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.695514 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.707866 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:32Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.708015 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710017 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710256 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.710266 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.727318 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 19:27:17.838835674 +0000 UTC Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.743883 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.743899 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.744013 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744007 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.744062 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744161 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744381 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:32 crc kubenswrapper[4775]: E0127 11:21:32.744583 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812321 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812333 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812348 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.812360 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914745 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914805 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:32 crc kubenswrapper[4775]: I0127 11:21:32.914826 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:32Z","lastTransitionTime":"2026-01-27T11:21:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017002 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017037 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.017078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119497 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119538 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.119559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222590 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.222682 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325486 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325558 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.325569 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.428360 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531159 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531237 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.531293 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633254 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633267 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633281 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.633292 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.727921 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 16:40:24.523978253 +0000 UTC Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735760 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.735950 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.736010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.736066 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838691 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838706 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.838715 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941288 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941592 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941658 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941743 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:33 crc kubenswrapper[4775]: I0127 11:21:33.941801 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:33Z","lastTransitionTime":"2026-01-27T11:21:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044279 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.044287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146387 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146419 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146427 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146439 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.146471 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172666 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" exitCode=1 Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.172994 4775 scope.go:117] "RemoveContainer" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.186886 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.197503 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.207423 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.220920 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.234922 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.248373 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.249947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250050 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250213 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.250287 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.263386 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.275345 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.288612 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.298305 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.310120 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.324434 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.342206 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is 
after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352006 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352549 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.352559 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.361912 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.374582 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.387854 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:34Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.454947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.454997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455011 4775 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455028 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.455042 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557524 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557601 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.557613 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659705 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659746 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659754 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659768 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.659779 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.728093 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 21:00:23.272324447 +0000 UTC Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744020 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744036 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744101 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.744157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744302 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744554 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744717 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:34 crc kubenswrapper[4775]: E0127 11:21:34.744869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761884 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.761893 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863935 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.863964 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.966652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.966956 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:34 crc kubenswrapper[4775]: I0127 11:21:34.967482 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:34Z","lastTransitionTime":"2026-01-27T11:21:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070064 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070110 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070123 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070140 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.070150 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.172993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173036 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173068 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.173079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.177090 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.177311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.192840 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.209772 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.224764 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.238749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.255536 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.268927 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275212 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275268 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.275298 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.281779 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.291705 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.305238 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.315615 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.325648 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.338523 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.351089 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.366727 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378671 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378752 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.378767 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.382821 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.400906 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.425818 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is 
after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a
2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:35Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480696 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480749 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480763 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.480772 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582947 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.582994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.583007 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.583016 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685312 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685345 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685353 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685364 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.685373 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.729790 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 01:59:50.039709723 +0000 UTC Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787854 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787889 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787917 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.787930 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890753 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890936 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.890966 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.993968 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994021 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994029 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:35 crc kubenswrapper[4775]: I0127 11:21:35.994054 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:35Z","lastTransitionTime":"2026-01-27T11:21:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096133 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096155 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.096166 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198227 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198264 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.198275 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300858 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300916 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300934 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.300946 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403851 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403928 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.403941 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507847 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507863 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.507904 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610315 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610414 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.610508 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.712999 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713059 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.713091 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.730592 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 07:40:57.454856903 +0000 UTC Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744915 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744928 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745251 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744974 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.744962 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745325 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:36 crc kubenswrapper[4775]: E0127 11:21:36.745505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815016 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815063 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815078 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815099 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.815112 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.917788 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918070 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918158 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918253 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:36 crc kubenswrapper[4775]: I0127 11:21:36.918348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:36Z","lastTransitionTime":"2026-01-27T11:21:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020844 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020885 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.020899 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123338 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123385 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123396 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.123427 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.225635 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332238 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332296 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.332314 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435659 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435732 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.435794 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
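The condition JSON in the setters.go:603 lines is exactly what the kubelet writes onto the Node object's status. If a working kubeconfig is available, the same condition can be read back from the API; a sketch, using the node name crc from the log:

  oc get node crc -o jsonpath='{.status.conditions[?(@.type=="Ready")].message}'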
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538520 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538605 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538629 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538655 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.538674 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641171 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641261 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641301 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.641319 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.731630 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 18:55:35.433161811 +0000 UTC
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744142 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744199 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744216 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744239 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.744257 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846882 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846949 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846969 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.846994 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.847012 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
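The certificate_manager.go:356 lines are a separate thread: the kubelet-serving certificate expires 2026-02-24, but each pass computes a jittered rotation deadline in late 2025, already in the past against the log's clock of 2026-01-27, so rotation is apparently due immediately and the message recurs about once per second. The certificate on disk can be inspected directly; the path below is the conventional kubelet location, an assumption rather than something this log states:

  openssl x509 -noout -subject -dates -in /var/lib/kubelet/pki/kubelet-server-current.pem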
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949793 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:37 crc kubenswrapper[4775]: I0127 11:21:37.949823 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:37Z","lastTransitionTime":"2026-01-27T11:21:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052653 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052695 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052717 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.052729 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155265 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155272 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155284 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.155292 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257488 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257586 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.257617 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360314 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360372 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360417 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.360434 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463293 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463347 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463375 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.463386 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565832 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565903 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565926 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.565943 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668506 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.668520 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.732339 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 11:45:54.639144091 +0000 UTC
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.744949 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.744988 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745167 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
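"No sandbox for pod can be found" means the pod object exists but CRI-O has no pod sandbox for it yet, and creating one is precisely the step that needs a CNI plugin, so the same four pods keep cycling every couple of seconds. With crictl configured against CRI-O on the node, the missing sandboxes are easy to confirm (a sketch, not output from this incident):

  # the stuck pods should show no, or only NotReady, sandboxes
  crictl pods --name network-metrics-daemon
  crictl pods --name network-check-target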
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.745190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.745242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745376 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745534 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:38 crc kubenswrapper[4775]: E0127 11:21:38.745681 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.746390 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.771993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772048 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.772130 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877702 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877813 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.877873 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980291 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980350 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980391 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:38 crc kubenswrapper[4775]: I0127 11:21:38.980445 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:38Z","lastTransitionTime":"2026-01-27T11:21:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090218 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090257 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090269 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.090296 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
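From this point the log interleaves a third failure: the pod status patches below are rejected because the API server cannot call the pod.network-node-identity.openshift.io webhook on 127.0.0.1:9743; the webhook's serving certificate expired 2025-08-24T17:21:41Z, long before the log's clock of 2026-01-27. The presented certificate can be checked directly from the node (a sketch, assuming the webhook endpoint is listening):

  echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -enddate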
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192127 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192205 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192229 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192277 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.192300 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.193007 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.196647 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.197701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c"}
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.198204 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg"
Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.212871 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.229356 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.253532 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.269031 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.283068 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299512 4775 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299553 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.299600 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.302882 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.320155 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.333159 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.346952 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.358763 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.372480 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.386141 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401344 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401395 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401409 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.401422 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.407467 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.426215 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.441661 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.461789 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is 
after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.471166 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503722 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503776 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503791 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503816 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.503832 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605892 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605932 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.605973 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709043 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709093 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709111 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709131 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.709145 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.732753 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:09:30.967697001 +0000 UTC Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811762 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811809 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811818 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.811845 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914687 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914767 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914790 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:39 crc kubenswrapper[4775]: I0127 11:21:39.914807 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:39Z","lastTransitionTime":"2026-01-27T11:21:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017831 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.017847 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121098 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121222 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.121243 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.211607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.212621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/2.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.216035 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217615 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" exitCode=1 Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.217774 4775 scope.go:117] "RemoveContainer" containerID="d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.219415 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.219916 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223335 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223370 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.223385 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.242936 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.265662 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.288481 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.318738 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d17c195a1cc20d9597f8a97c305b73195b68f9e0aea8638e77f8a3ec38783bed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:13Z\\\",\\\"message\\\":\\\"ller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:13Z is 
after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:13.624082 6431 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-service-ca-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"2a3fb1a3-a476-4e14-bcf5-fb79af60206a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-service-ca-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.325958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326106 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.326123 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.336320 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.356639 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.376930 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.395571 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.415212 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430688 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.430805 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.431780 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.451694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.475429 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.492184 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.517117 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\
",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533557 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533631 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.533669 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.537728 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.555387 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.576229 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:40Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636953 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.636997 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.637017 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.733190 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 11:02:39.325860329 +0000 UTC Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739675 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739757 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739780 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.739797 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744383 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.744496 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744564 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744626 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.744808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:40 crc kubenswrapper[4775]: E0127 11:21:40.745028 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.844144 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845107 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845126 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845147 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.845159 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947604 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947618 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947638 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:40 crc kubenswrapper[4775]: I0127 11:21:40.947654 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:40Z","lastTransitionTime":"2026-01-27T11:21:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050593 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.050606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153371 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153493 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153513 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153541 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.153558 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.227347 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.230065 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.232178 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:41 crc kubenswrapper[4775]: E0127 11:21:41.232521 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.247717 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256102 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256167 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256184 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.256227 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.263634 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.281986 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.301928 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.319733 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.341404 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358352 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc 
kubenswrapper[4775]: I0127 11:21:41.358483 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.358553 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.370839 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from 
lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.383952 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.403766 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.420099 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.432688 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.447283 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461669 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461879 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461924 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.461952 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.477136 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.492014 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.507043 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.517291 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563764 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563822 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563839 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.563852 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.666750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.666795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667077 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667105 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.667119 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.733832 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 08:00:24.198113694 +0000 UTC Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.762256 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769643 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769668 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769676 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.769700 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.774868 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.788253 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.810425 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.840504 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from 
lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.859537 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872231 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872285 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.872348 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.874258 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.892965 4775 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.907773 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117
ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.925552 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.938308 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.952021 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.967032 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975189 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975203 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.975212 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:41Z","lastTransitionTime":"2026-01-27T11:21:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.983250 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:41 crc kubenswrapper[4775]: I0127 11:21:41.996674 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:41Z is after 2025-08-24T17:21:41Z" Jan 27 
11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.011694 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.025162 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079085 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079165 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079192 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.079244 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181766 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181828 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.181885 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:42Z","lastTransitionTime":"2026-01-27T11:21:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[The same five node-status entries repeat verbatim at 11:21:42.284, 11:21:42.388, 11:21:42.491, 11:21:42.595, and 11:21:42.698.]
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.734895 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 07:07:48.736426403 +0000 UTC
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744288 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744368 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744383 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744535 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:42 crc kubenswrapper[4775]: I0127 11:21:42.744583 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744684 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.744944 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.745016 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
[The five node-status entries repeat again at 11:21:42.803 and 11:21:42.906.]
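The certificate_manager entry above is worth flagging: the kubelet-serving certificate is valid until 2026-02-24, but the rotation deadline it logs (2025-11-25) is already about two months behind the journal's clock (2026-01-27), so rotation is overdue at this boot. A minimal sketch of that comparison, with the three timestamps copied from the log line (illustrative arithmetic only, not kubelet code; log timestamps reformatted to RFC 3339, fractional seconds dropped):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps taken from the certificate_manager.go:356 entry above.
	expiry, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z")
	deadline, _ := time.Parse(time.RFC3339, "2025-11-25T07:07:48Z")
	// The journal's wall-clock time when the entry was written.
	now, _ := time.Parse(time.RFC3339, "2026-01-27T11:21:42Z")

	fmt.Println("rotation overdue:", now.After(deadline)) // true: the deadline passed ~2 months ago
	fmt.Println("left until expiry:", expiry.Sub(now))    // 666h31m21s, roughly 27.7 days
}
```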
[The five node-status entries repeat at 11:21:42.973.]
Jan 27 11:21:42 crc kubenswrapper[4775]: E0127 11:21:42.988376 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:42Z is after 
2025-08-24T17:21:41Z"
[The five node-status entries repeat at 11:21:42.993.]
Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.014981 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [payload identical to the 11:21:42.988376 entry above, with condition timestamps advanced to 2026-01-27T11:21:43Z]" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z"
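The status-patch failures above share one root cause: the node-identity webhook at 127.0.0.1:9743 presents a TLS certificate that expired on 2025-08-24T17:21:41Z, while the node's clock reads 2026-01-27. A minimal sketch of reproducing that validity check from the node (assumes the endpoint is reachable from where this runs; InsecureSkipVerify only lets the handshake complete so the certificate can be read, it is not a fix):

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// Endpoint taken from the webhook error above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // inspect the cert even though verification would fail
	})
	if err != nil {
		fmt.Println("dial failed:", err)
		return
	}
	defer conn.Close()

	// The leaf certificate the webhook served; compare its validity
	// window against the local clock, as the failing x509 check does.
	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("NotBefore=%s NotAfter=%s expired=%v\n",
		cert.NotBefore.Format(time.RFC3339),
		cert.NotAfter.Format(time.RFC3339),
		time.Now().After(cert.NotAfter))
}
```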
[The five node-status entries repeat at 11:21:43.018, now with condition timestamps of 2026-01-27T11:21:43Z.]
Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.036678 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [payload identical to the 11:21:42.988376 entry above]" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z"
[The five node-status entries repeat at 11:21:43.041.]
Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.060576 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [payload identical to the 11:21:42.988376 entry above]" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 2025-08-24T17:21:41Z"
2025-08-24T17:21:41Z" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065624 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065651 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.065702 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.085184 4775 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"a0dbf9f1-4fdf-4ebf-a3ba-37ba333f18f2\\\",\\\"systemUUID\\\":\\\"574d97c2-3ebe-40ee-9434-ec47862a34d4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:43Z is after 
2025-08-24T17:21:41Z" Jan 27 11:21:43 crc kubenswrapper[4775]: E0127 11:21:43.085407 4775 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088221 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088435 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.088525 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191887 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.191897 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294647 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294677 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.294696 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397445 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397474 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.397501 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500097 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500116 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500141 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.500161 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603377 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603478 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603490 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603505 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.603516 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706803 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706829 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.706839 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.735172 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 06:14:40.153876792 +0000 UTC Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810071 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.810721 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913473 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913602 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913621 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913648 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:43 crc kubenswrapper[4775]: I0127 11:21:43.913665 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:43Z","lastTransitionTime":"2026-01-27T11:21:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
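
The certificate_manager line above (and its repetitions below, at 11:21:44, 11:21:45, and 11:21:46) reports the same expiration, 2026-02-24 05:53:03 UTC, but a different rotation deadline each time. That is expected behavior as I understand client-go's certificate manager: each deadline is drawn at random from roughly the 70-90% span of the certificate's validity window, so every recomputation lands on a new point. A minimal sketch of that idea, with an assumed one-year validity window (the log shows only the expiry, not the issue date, so notBefore here is illustrative):

    // rotation_deadline.go: illustrate why the "rotation deadline" differs
    // on every pass above. The 70-90% window mirrors the jittered
    // nextRotationDeadline in client-go's certificate manager; the concrete
    // dates below are assumptions, not values taken from the kubelet.
    package main

    import (
        "fmt"
        "math/rand"
        "time"
    )

    // jitteredDeadline returns notBefore + lifetime * (0.7 + 0.2*rand),
    // i.e. a random point in the 70%..90% span of the validity window.
    func jitteredDeadline(notBefore, notAfter time.Time) time.Time {
        total := notAfter.Sub(notBefore)
        jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
        return notBefore.Add(jittered)
    }

    func main() {
        // Expiry taken from the log; a one-year lifetime is assumed.
        notAfter, _ := time.Parse(time.RFC3339, "2026-02-24T05:53:03Z")
        notBefore := notAfter.AddDate(-1, 0, 0)

        for i := 0; i < 3; i++ {
            fmt.Println("rotation deadline:", jitteredDeadline(notBefore, notAfter))
        }
    }

Note that every deadline the kubelet prints (2025-11-19, 2026-01-09, 2025-12-30, 2025-12-01) is already in the past relative to the node's 2026-01-27 clock, which is consistent with the manager re-evaluating rotation on each pass rather than waiting.
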
Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016865 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016964 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.016993 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.017019 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119489 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119523 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119536 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119579 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.119593 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221578 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221612 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221623 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221637 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.221645 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323824 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323845 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.323859 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426817 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426888 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426912 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.426928 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529765 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529937 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.529984 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.530003 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.632914 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633010 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.633065 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.736183 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 00:06:03.070524531 +0000 UTC Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739661 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739681 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739707 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.739742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.743916 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744098 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.744370 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.744719 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.744829 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.745053 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:44 crc kubenswrapper[4775]: E0127 11:21:44.745152 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.842943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843006 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843027 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843056 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.843078 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
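
The four sandbox-less pods above (network-check-source, network-check-target, network-metrics-daemon, networking-console-plugin) are blocked by the same condition the kubelet keeps republishing: no CNI configuration file exists in /etc/kubernetes/cni/net.d/ yet, so NetworkReady stays false and every pod sync is skipped. The Ready condition's shape is easier to see pulled out of the log; a small sketch that parses one occurrence (the struct mirrors only the fields present in the logged JSON, not the full Kubernetes NodeCondition type):

    // ready_condition.go: parse the node Ready condition exactly as it
    // appears, repeatedly, in the entries above.
    package main

    import (
        "encoding/json"
        "fmt"
        "log"
    )

    type nodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        // Condition object copied verbatim from the log.
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

        var c nodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            log.Fatal(err)
        }
        fmt.Printf("node ready: %t (reason %s)\n", c.Status == "True", c.Reason)
        fmt.Println("message:", c.Message)
    }

Until the network operator writes a CNI config into /etc/kubernetes/cni/net.d/, the same condition, and the same per-pod sync errors, can be expected to repeat.
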
Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946710 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946789 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946814 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:44 crc kubenswrapper[4775]: I0127 11:21:44.946862 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:44Z","lastTransitionTime":"2026-01-27T11:21:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049807 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049867 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.049925 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.152960 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.255996 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256108 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256128 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256153 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.256171 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359367 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359476 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359496 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359527 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.359551 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462628 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.462649 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564519 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564565 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.564606 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668033 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668094 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668112 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.668158 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.736783 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 15:58:02.548296856 +0000 UTC Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771076 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771595 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771620 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.771638 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873801 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873830 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873838 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873852 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.873860 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976436 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976514 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976530 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976552 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:45 crc kubenswrapper[4775]: I0127 11:21:45.976570 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:45Z","lastTransitionTime":"2026-01-27T11:21:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079795 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079855 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079873 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.079886 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182608 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182626 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182652 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.182672 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285424 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285508 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285548 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.285565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388421 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388503 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.388565 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491526 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491596 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491614 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491639 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.491661 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.594911 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595001 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595052 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.595097 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698376 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698442 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.698468 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.737093 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 02:39:44.434089731 +0000 UTC Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744531 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744571 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744605 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.744757 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.744844 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.744968 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.745140 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:46 crc kubenswrapper[4775]: E0127 11:21:46.745383 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801220 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801292 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801334 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.801352 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
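Every "Error syncing pod" above has the same root cause: nothing has installed a network config under /etc/kubernetes/cni/net.d/ yet; on OpenShift the cluster network operator writes one once OVN-Kubernetes comes up. Purely to illustrate the file format the runtime is polling for, a hypothetical minimal .conflist (the bridge/host-local plugin choice, the subnet, and the file name are placeholder assumptions, not what OpenShift actually installs):

```go
package main

import (
	"log"
	"os"
)

// A minimal CNI .conflist, shown only to illustrate the format the runtime
// is waiting for. The plugin choice and subnet below are placeholders; the
// real file on this node is laid down by the network operator, not by hand.
const conflist = `{
  "cniVersion": "0.4.0",
  "name": "example-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16"
      }
    }
  ]
}`

func main() {
	// The runtime on this node scans /etc/kubernetes/cni/net.d/; once any
	// valid *.conflist appears there, NetworkReady flips to true and the
	// pod sandboxes above can be created.
	err := os.WriteFile("/etc/kubernetes/cni/net.d/10-example.conflist",
		[]byte(conflist), 0o644)
	if err != nil {
		log.Fatal(err)
	}
}
```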
Has your network provider started?"} Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904522 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904600 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904625 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:46 crc kubenswrapper[4775]: I0127 11:21:46.904644 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:46Z","lastTransitionTime":"2026-01-27T11:21:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007400 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007411 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.007478 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111039 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111117 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111137 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111163 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.111183 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.213970 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214060 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214079 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214103 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.214121 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317351 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317412 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317430 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317480 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.317497 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420833 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420891 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420908 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420931 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.420950 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524429 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524543 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524564 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524587 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.524605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629025 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629090 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.629149 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732680 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732750 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732771 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732794 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.732811 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.738013 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 16:17:09.478867748 +0000 UTC Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835731 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835810 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835835 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835870 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.835896 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938407 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938507 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938532 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938556 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:47 crc kubenswrapper[4775]: I0127 11:21:47.938573 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:47Z","lastTransitionTime":"2026-01-27T11:21:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041181 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041240 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041252 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041278 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.041306 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144431 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144531 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144554 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144584 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.144605 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248169 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248247 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248275 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248305 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.248329 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.351985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352035 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.352079 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455342 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455393 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455422 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.455436 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558547 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558575 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.558588 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587603 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587574508 +0000 UTC m=+151.729172305 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587679 4775 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.587726 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587735 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587717783 +0000 UTC m=+151.729315570 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587893 4775 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.587969 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.587956769 +0000 UTC m=+151.729554556 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661319 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661369 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661381 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661399 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.661412 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
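The "No retries permitted until ... (durationBeforeRetry 1m4s)" entries come from per-operation exponential backoff in nestedpendingoperations: each failed mount/unmount doubles the wait before the next attempt. A sketch under assumed upstream defaults (500ms initial, roughly a 2m cap; the 1m4s seen above is the 64s step of that doubling, not verified against this build):

```go
package main

import (
	"fmt"
	"time"
)

// Assumed defaults for the volume-operation backoff; each failure doubles
// the wait, capped so a stuck volume retries about every two minutes.
const (
	initialDurationBeforeRetry = 500 * time.Millisecond
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second
)

func next(d time.Duration) time.Duration {
	if d == 0 {
		return initialDurationBeforeRetry
	}
	d *= 2
	if d > maxDurationBeforeRetry {
		d = maxDurationBeforeRetry
	}
	return d
}

func main() {
	var d time.Duration
	for i := 0; i < 9; i++ {
		d = next(d)
		fmt.Println(d) // ... 32s, 1m4s, 2m2s (cap reached)
	}
}
```

The UnmountVolume failure above is a separate issue: the kubevirt.io.hostpath-provisioner CSI driver has not re-registered after the restart, so TearDownAt cannot find a CSI client until the driver pod comes back.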
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.689268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.689327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689525 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689550 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689567 4775 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689634 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.689617573 +0000 UTC m=+151.831215360 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689644 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689691 4775 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689712 4775 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.689803 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.689777548 +0000 UTC m=+151.831375355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.738535 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:00:09.400367687 +0000 UTC Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744871 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744969 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.744995 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.745261 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745265 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745400 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745521 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:48 crc kubenswrapper[4775]: E0127 11:21:48.745682 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764290 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764327 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764340 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764356 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.764368 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868283 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868361 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868384 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868416 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.868441 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971041 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971150 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971204 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:48 crc kubenswrapper[4775]: I0127 11:21:48.971248 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:48Z","lastTransitionTime":"2026-01-27T11:21:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074122 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074139 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074168 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.074184 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177560 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177622 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177666 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.177687 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280172 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280223 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280233 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280251 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.280262 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384084 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384201 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384230 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384262 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.384291 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487796 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487812 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487834 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.487851 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591751 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591850 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591871 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591893 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.591910 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695081 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695160 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695183 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695209 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.695226 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.756009 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 23:02:26.954352502 +0000 UTC
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798404 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798517 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798537 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798561 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.798580 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902398 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902491 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902510 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902534 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:49 crc kubenswrapper[4775]: I0127 11:21:49.902552 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:49Z","lastTransitionTime":"2026-01-27T11:21:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.004946 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005049 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005062 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005075 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.005084 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107697 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107742 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107783 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107806 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.107822 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210242 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210294 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210310 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210331 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.210349 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313683 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313821 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313849 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313880 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.313902 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416660 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416938 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416958 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.416974 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520132 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520180 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520191 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.520220 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623808 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623869 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.623927 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.726939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.726988 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727000 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.727030 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744832 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk"
Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.744900 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.744868 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745002 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38"
Jan 27 11:21:50 crc kubenswrapper[4775]: E0127 11:21:50.745348 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.756509 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 22:06:21.739436664 +0000 UTC
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830208 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830266 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830286 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830311 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.830330 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933157 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933215 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933235 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933259 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:50 crc kubenswrapper[4775]: I0127 11:21:50.933277 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:50Z","lastTransitionTime":"2026-01-27T11:21:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036615 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036682 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036698 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036726 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.036742 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139890 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139945 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139962 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.139986 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.140006 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242642 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242699 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242739 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.242757 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346382 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346433 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346477 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346501 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.346518 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449641 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449704 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449720 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449748 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.449771 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553067 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553550 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.553784 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.554009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.554215 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657046 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657634 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657738 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657781 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.657822 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.757574 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 07:43:15.416371567 +0000 UTC Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760349 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760410 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760479 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.760499 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.768555 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.788611 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5d6c2c26b5723170eed90f2863cb7fbadc0880e4d75fba59be150deea37d66ee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcffa5c28966866aedb8e91329e31ef7bf36c60a58f15f79fe6720ec030e5ac5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.809517 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-gm7w4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aba2edc6-0e64-4995-830d-e177919ea13e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:34Z\\\",\\\"message\\\":\\\"2026-01-27T11:20:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940\\\\n2026-01-27T11:20:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_c2fbf43f-fbbe-46f3-a71a-d5caba311940 to /host/opt/cni/bin/\\\\n2026-01-27T11:20:49Z [verbose] multus-daemon started\\\\n2026-01-27T11:20:49Z [verbose] Readiness Indicator file check\\\\n2026-01-27T11:21:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pj4jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-gm7w4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.832615 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"404c5bcc-dd1d-479b-8ce2-2b9fd6f2db9d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://09a3895e6604900f4cbfef3ff2de6aa31aa08600e39bc4be6a6ca98a1a637fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9b1fcec6ed80413c9e5839a55479efe79537fb678af6d96a566cdc93cf88d69c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ac9f48605a7af406581a4c8747c0eb9a95738177df3991bae6153fd33658936\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://782430ed7dc4da0c5bd887c59c7eab2c1832724386a99d0cba1157acf8a977f0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d481ce348d355a996111d5c5fc94c9889756a400dca3a0860915d64f2848c20c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aeefad34a509cf14e5bdd63b8ebfecd4a2a845c3ea0b488d2491c1192265ea9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b22e6df3358997a8b3dd665d53b5a4dc65f9075986e0c77b875c521e7a8f5dda\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mj6q4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dcnmf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862825 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862883 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc 
kubenswrapper[4775]: I0127 11:21:51.862907 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862939 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.862962 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.867209 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ 
CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T11:21:39Z\\\",\\\"message\\\":\\\"ons: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:39Z is after 2025-08-24T17:21:41Z]\\\\nI0127 11:21:39.698903 6850 services_controller.go:454] Service openshift-controller-manager/controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0127 11:21:39.698995 6850 services_controller.go:434] Service openshift-route-controller-manager/route-controller-manager retrieved from 
lister for network=default: \\\\u0026Service{ObjectMeta:{route-controller-manager openshift-route-controller-manager 754a1504-193a-42d9-b250-5d40bcccc281 4720 0 2025-02-23 05:22:48 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:route-controller-manager] map[operator.openshift.io/spec-hash:a480352ea60c2dcd2b3870bf0c3650528ef9b51aaa3fe6baa1e3711da18fffa3 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePo\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:21:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-czdm4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:46Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nzthg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.883749 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9dz9r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c1ce49b6-6832-4f61-bad3-63174f36eba9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://280e6d912cd3bbd01de0d7438d744e4b90deb06d6aac3eecc074d703f787eb6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hhgjw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:48Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9dz9r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.899405 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-b48nk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c945c8b1-655c-4522-b703-0c5b9b8fcf38\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m5frt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:59Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-b48nk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.917871 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4c79a8c7-4ea7-481d-a30e-81dfc645959f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d920efaa373020675c0d72ccb3dc167347139de6a79847ad3124cac76371490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://768e06c9b15c381ffb01fe5b64de8fa4971393d4f14c3dc7b79b0e03fa21b9f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a4f463974ba99187ffed4628f6eff5e1ec35b2f951d4a1d67673034f38e14736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5760d4a3f2141d061cf45c08bf3859ff09e0c977fb117569ec631917323c7ce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.938953 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f2ad5463-900c-4c6a-b8f8-4961abf97877\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operato
r@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"le observer\\\\nW0127 11:20:44.220835 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0127 11:20:44.220954 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 11:20:44.221778 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2797378847/tls.crt::/tmp/serving-cert-2797378847/tls.key\\\\\\\"\\\\nI0127 11:20:44.889854 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 11:20:44.892614 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 11:20:44.892637 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 11:20:44.892654 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 11:20:44.892661 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 11:20:44.906356 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 11:20:44.906382 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906388 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 11:20:44.906392 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nI0127 11:20:44.906389 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nW0127 11:20:44.906396 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 11:20:44.906419 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 11:20:44.906427 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 11:20:44.907780 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:21:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T11:20:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.958620 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c0e66541e2f5c095ba63c52cb421be629db2ca58239123ef052d1268da9393f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.965974 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966031 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966053 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.966063 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:51Z","lastTransitionTime":"2026-01-27T11:21:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.974526 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0c1777ea0d09a6a8118cc2f86605a15c2d2db3f5a09f000ce9dd881e8c917adf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:51 crc kubenswrapper[4775]: I0127 11:21:51.991592 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7707cf23-0a23-4f57-8184-f7a4f7587aa2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41f04403bb2da7971ecd0b1b3a957b167d13a094296654c490b1f6ba02cb0bd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2tdkh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qn99x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:51Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.011178 4775 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66564dcb-1fe2-45d0-a1ab-c352ec5a4837\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c6555005224f9e10606a7740db2af0bd0c4d2305148c3da56ed66ea6c53f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce3bfed3013607646d4d613ffef74e1c8511a17209b7a62b943cde556acddf45\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b924ce8c0c1247b9afe7e4200b6
664b0291a5d5a3d2eb8795d982f40f068326d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.031395 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.049272 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"722c4ef1-b8ec-4732-908b-4c697d7eef60\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b797415c0815db7d12411613acb795e22ca87614178c4a8a3c9ba7c90d10d9eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://971fc01faa6d5178e21b1503855381856e18bebd0649991dbc274e259366cea4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mjhfr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:57Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-7jxr5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069445 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069617 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069673 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069690 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.069733 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.086624 4775 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-vxn5f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0c38486b-7aef-4d58-8637-207994a976d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T11:20:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b7d490610a9a9a92e6328119672701835379ba8bbff31154ba2b008eacd20ba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T11:20:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-c2wmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T11:20:44Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-vxn5f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T11:21:52Z is after 2025-08-24T17:21:41Z" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172777 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172881 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172901 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172925 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.172942 4775 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277360 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277428 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277446 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277495 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.277513 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.380864 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.380982 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381109 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381145 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.381181 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483570 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483580 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483594 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.483603 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586228 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586307 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586330 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586363 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.586388 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689735 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689896 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689921 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689951 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.689977 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744753 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744826 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744941 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.744942 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.744768 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745202 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745338 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:52 crc kubenswrapper[4775]: E0127 11:21:52.745722 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.758249 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 17:43:44.49215773 +0000 UTC Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793302 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793355 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793378 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793405 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.793429 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896778 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896843 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896860 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896886 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.896903 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999529 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999571 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999582 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999599 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:52 crc kubenswrapper[4775]: I0127 11:21:52.999610 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:52Z","lastTransitionTime":"2026-01-27T11:21:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.102904 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.102983 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103009 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103042 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.103061 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206304 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206434 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206484 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206515 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.206539 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308716 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308823 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308842 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308909 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.308930 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412365 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412413 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412432 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412485 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.412504 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414898 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414943 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414960 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.414985 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.415002 4775 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T11:21:53Z","lastTransitionTime":"2026-01-27T11:21:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.487045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l"] Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.487893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.492588 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494153 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494277 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.494848 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547141 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.547284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.553478 4775 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-multus/multus-gm7w4" podStartSLOduration=68.553431296 podStartE2EDuration="1m8.553431296s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.55319672 +0000 UTC m=+92.694794497" watchObservedRunningTime="2026-01-27 11:21:53.553431296 +0000 UTC m=+92.695029083" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.578536 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dcnmf" podStartSLOduration=68.578514243 podStartE2EDuration="1m8.578514243s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.578312058 +0000 UTC m=+92.719909845" watchObservedRunningTime="2026-01-27 11:21:53.578514243 +0000 UTC m=+92.720112020" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.638018 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9dz9r" podStartSLOduration=68.637991913 podStartE2EDuration="1m8.637991913s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.637938912 +0000 UTC m=+92.779536709" watchObservedRunningTime="2026-01-27 11:21:53.637991913 +0000 UTC m=+92.779589720" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648399 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.648772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4114802c-bae6-4711-b8c9-32a0cd83395a-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.650419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4114802c-bae6-4711-b8c9-32a0cd83395a-service-ca\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.658609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4114802c-bae6-4711-b8c9-32a0cd83395a-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.678180 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4114802c-bae6-4711-b8c9-32a0cd83395a-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-ws68l\" (UID: \"4114802c-bae6-4711-b8c9-32a0cd83395a\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.703505 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.703486026 podStartE2EDuration="1m8.703486026s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.690541982 +0000 UTC m=+92.832139849" watchObservedRunningTime="2026-01-27 11:21:53.703486026 +0000 UTC m=+92.845083803" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.731361 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podStartSLOduration=68.73134458 podStartE2EDuration="1m8.73134458s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.730780865 +0000 UTC m=+92.872378662" watchObservedRunningTime="2026-01-27 11:21:53.73134458 +0000 UTC m=+92.872942357" Jan 27 
11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.745101 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:21:53 crc kubenswrapper[4775]: E0127 11:21:53.745296 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.752692 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.752663604 podStartE2EDuration="42.752663604s" podCreationTimestamp="2026-01-27 11:21:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.750153246 +0000 UTC m=+92.891751053" watchObservedRunningTime="2026-01-27 11:21:53.752663604 +0000 UTC m=+92.894261421" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.759197 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 08:58:51.329426498 +0000 UTC Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.759264 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.765794 4775 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.797622 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-7jxr5" podStartSLOduration=68.797597575 podStartE2EDuration="1m8.797597575s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.779621073 +0000 UTC m=+92.921218890" watchObservedRunningTime="2026-01-27 11:21:53.797597575 +0000 UTC m=+92.939195392" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.798202 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.798193031 podStartE2EDuration="1m7.798193031s" podCreationTimestamp="2026-01-27 11:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.796991108 +0000 UTC m=+92.938588885" watchObservedRunningTime="2026-01-27 11:21:53.798193031 +0000 UTC m=+92.939790848" Jan 27 11:21:53 crc kubenswrapper[4775]: I0127 11:21:53.808730 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" Jan 27 11:21:53 crc kubenswrapper[4775]: W0127 11:21:53.832593 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4114802c_bae6_4711_b8c9_32a0cd83395a.slice/crio-ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1 WatchSource:0}: Error finding container ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1: Status 404 returned error can't find the container with id ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1 Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.283111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" event={"ID":"4114802c-bae6-4711-b8c9-32a0cd83395a","Type":"ContainerStarted","Data":"2ead5dfdc7813ef890cc5608717b27e8d92dffc62dd8f7e7bfc20b5227843729"} Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.283184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" event={"ID":"4114802c-bae6-4711-b8c9-32a0cd83395a","Type":"ContainerStarted","Data":"ba3812dc1073082c6b6e1070cd1bd508b7b14e5f400732f34e50b62a32ee0cc1"} Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.303490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-ws68l" podStartSLOduration=69.303419791 podStartE2EDuration="1m9.303419791s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:54.303093492 +0000 UTC m=+93.444691309" watchObservedRunningTime="2026-01-27 11:21:54.303419791 +0000 UTC m=+93.445017598" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.304395 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-vxn5f" podStartSLOduration=70.304379436 podStartE2EDuration="1m10.304379436s" podCreationTimestamp="2026-01-27 11:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:21:53.853687342 +0000 UTC m=+92.995285169" watchObservedRunningTime="2026-01-27 11:21:54.304379436 +0000 UTC m=+93.445977253" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744171 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744593 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744621 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:54 crc kubenswrapper[4775]: I0127 11:21:54.744683 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744735 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744822 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:54 crc kubenswrapper[4775]: E0127 11:21:54.744474 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744169 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744683 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744793 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.744297 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.744986 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:56 crc kubenswrapper[4775]: E0127 11:21:56.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:21:56 crc kubenswrapper[4775]: I0127 11:21:56.762605 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744041 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744073 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:21:58 crc kubenswrapper[4775]: I0127 11:21:58.744189 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.744408 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.744780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.745005 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:21:58 crc kubenswrapper[4775]: E0127 11:21:58.745220 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.744908 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.745015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.744926 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745186 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.745597 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745711 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.745836 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:00 crc kubenswrapper[4775]: E0127 11:22:00.746109 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:00 crc kubenswrapper[4775]: I0127 11:22:00.769402 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 11:22:01 crc kubenswrapper[4775]: I0127 11:22:01.803496 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=5.803429447 podStartE2EDuration="5.803429447s" podCreationTimestamp="2026-01-27 11:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:01.762769863 +0000 UTC m=+100.904367650" watchObservedRunningTime="2026-01-27 11:22:01.803429447 +0000 UTC m=+100.945027254" Jan 27 11:22:01 crc kubenswrapper[4775]: I0127 11:22:01.805758 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=1.80574344 podStartE2EDuration="1.80574344s" podCreationTimestamp="2026-01-27 11:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:01.802982605 +0000 UTC m=+100.944580462" watchObservedRunningTime="2026-01-27 11:22:01.80574344 +0000 UTC m=+100.947341257" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.744961 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745027 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745035 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745149 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745241 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.745240 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745661 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.745846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:02 crc kubenswrapper[4775]: I0127 11:22:02.977629 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.977809 4775 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:22:02 crc kubenswrapper[4775]: E0127 11:22:02.977931 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs podName:c945c8b1-655c-4522-b703-0c5b9b8fcf38 nodeName:}" failed. No retries permitted until 2026-01-27 11:23:06.977890598 +0000 UTC m=+166.119488415 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs") pod "network-metrics-daemon-b48nk" (UID: "c945c8b1-655c-4522-b703-0c5b9b8fcf38") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.744640 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.744792 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745673 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745750 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:04 crc kubenswrapper[4775]: I0127 11:22:04.745795 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.745933 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.746323 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:04 crc kubenswrapper[4775]: E0127 11:22:04.746591 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:05 crc kubenswrapper[4775]: I0127 11:22:05.745570 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:05 crc kubenswrapper[4775]: E0127 11:22:05.745745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744183 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744133 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744274 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:06 crc kubenswrapper[4775]: I0127 11:22:06.744689 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.744727 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:06 crc kubenswrapper[4775]: E0127 11:22:06.745220 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.744274 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.744835 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.745049 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.745159 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:08 crc kubenswrapper[4775]: I0127 11:22:08.745200 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.745759 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.746048 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:08 crc kubenswrapper[4775]: E0127 11:22:08.746343 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744629 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744697 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744826 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:10 crc kubenswrapper[4775]: I0127 11:22:10.744892 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.744942 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745029 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745135 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:10 crc kubenswrapper[4775]: E0127 11:22:10.745211 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744222 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744268 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744240 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744372 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:12 crc kubenswrapper[4775]: I0127 11:22:12.744434 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744609 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744782 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:12 crc kubenswrapper[4775]: E0127 11:22:12.744858 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744484 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744570 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744488 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.744664 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:14 crc kubenswrapper[4775]: I0127 11:22:14.744516 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.744820 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.745015 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:14 crc kubenswrapper[4775]: E0127 11:22:14.745104 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.743943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744042 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.744685 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.744760 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745022 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745121 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745182 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:16 crc kubenswrapper[4775]: I0127 11:22:16.745498 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:16 crc kubenswrapper[4775]: E0127 11:22:16.745843 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nzthg_openshift-ovn-kubernetes(7d657d41-09b6-43f2-babb-4cb13a62fd1f)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744757 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744876 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:18 crc kubenswrapper[4775]: I0127 11:22:18.744829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.744939 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745154 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745196 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:18 crc kubenswrapper[4775]: E0127 11:22:18.745277 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.376532 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377355 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/0.log" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377433 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" exitCode=1 Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298"} Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.377623 4775 scope.go:117] "RemoveContainer" containerID="e8760aa6be4c3dd6c49603ea8ebd5d3f440304f8aec9c088f5cf625c8c2ff8bc" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.378183 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.378739 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744163 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744322 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744160 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:20 crc kubenswrapper[4775]: I0127 11:22:20.744259 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744484 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744546 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:20 crc kubenswrapper[4775]: E0127 11:22:20.744620 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:21 crc kubenswrapper[4775]: I0127 11:22:21.394663 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:21 crc kubenswrapper[4775]: E0127 11:22:21.691445 4775 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 11:22:21 crc kubenswrapper[4775]: E0127 11:22:21.857099 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744855 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744899 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.744875 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:22 crc kubenswrapper[4775]: I0127 11:22:22.745029 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745203 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745418 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:22 crc kubenswrapper[4775]: E0127 11:22:22.745690 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744687 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744778 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744806 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:24 crc kubenswrapper[4775]: I0127 11:22:24.744991 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.744969 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745143 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745362 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:24 crc kubenswrapper[4775]: E0127 11:22:24.745736 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744518 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744563 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744569 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.744709 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:26 crc kubenswrapper[4775]: I0127 11:22:26.744753 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.744934 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.745107 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.745229 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:26 crc kubenswrapper[4775]: E0127 11:22:26.858648 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744658 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744691 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744718 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:28 crc kubenswrapper[4775]: I0127 11:22:28.744664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.744833 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.744936 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.745044 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:28 crc kubenswrapper[4775]: E0127 11:22:28.745123 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744299 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744401 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.744420 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744641 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744856 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.744971 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:30 crc kubenswrapper[4775]: E0127 11:22:30.745090 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:30 crc kubenswrapper[4775]: I0127 11:22:30.746185 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.434077 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.436895 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.437849 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerStarted","Data":"fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007"} Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.438392 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.470159 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podStartSLOduration=106.470134032 podStartE2EDuration="1m46.470134032s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:31.468149527 +0000 UTC m=+130.609747334" watchObservedRunningTime="2026-01-27 11:22:31.470134032 +0000 UTC m=+130.611731839" Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.581917 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:22:31 crc kubenswrapper[4775]: I0127 11:22:31.582044 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:31 crc kubenswrapper[4775]: E0127 11:22:31.582180 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:31 crc kubenswrapper[4775]: E0127 11:22:31.861134 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744509 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744625 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744682 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:32 crc kubenswrapper[4775]: I0127 11:22:32.744705 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744842 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:32 crc kubenswrapper[4775]: E0127 11:22:32.744950 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:33 crc kubenswrapper[4775]: I0127 11:22:33.744104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:33 crc kubenswrapper[4775]: E0127 11:22:33.744621 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744182 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744243 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744358 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744574 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:34 crc kubenswrapper[4775]: I0127 11:22:34.744200 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:34 crc kubenswrapper[4775]: E0127 11:22:34.744894 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:35 crc kubenswrapper[4775]: I0127 11:22:35.744056 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:35 crc kubenswrapper[4775]: I0127 11:22:35.744417 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:22:35 crc kubenswrapper[4775]: E0127 11:22:35.744384 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.457955 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.458032 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"} Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.744404 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.744507 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.744650 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.744780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:36 crc kubenswrapper[4775]: I0127 11:22:36.745943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.746311 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:36 crc kubenswrapper[4775]: E0127 11:22:36.862371 4775 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 11:22:37 crc kubenswrapper[4775]: I0127 11:22:37.744260 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:37 crc kubenswrapper[4775]: E0127 11:22:37.744520 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744537 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744566 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:38 crc kubenswrapper[4775]: I0127 11:22:38.744641 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.744753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.744894 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:38 crc kubenswrapper[4775]: E0127 11:22:38.745049 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:39 crc kubenswrapper[4775]: I0127 11:22:39.744349 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:39 crc kubenswrapper[4775]: E0127 11:22:39.744593 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744717 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744795 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:40 crc kubenswrapper[4775]: I0127 11:22:40.744817 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.744906 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.745073 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 11:22:40 crc kubenswrapper[4775]: E0127 11:22:40.745252 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 11:22:41 crc kubenswrapper[4775]: I0127 11:22:41.744843 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:41 crc kubenswrapper[4775]: E0127 11:22:41.746789 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-b48nk" podUID="c945c8b1-655c-4522-b703-0c5b9b8fcf38" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744292 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.744371 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.747744 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.747895 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.748179 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 11:22:42 crc kubenswrapper[4775]: I0127 11:22:42.748712 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.744121 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.747011 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.747440 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 11:22:43 crc kubenswrapper[4775]: I0127 11:22:43.827503 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.899019 4775 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.944082 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.944706 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.945156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.945579 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.950840 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.951760 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.951839 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.953163 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.953682 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.954962 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955001 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955102 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.955929 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957304 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957490 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.957530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.960045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.960920 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.964062 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.965061 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.974557 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.976239 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.976704 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977272 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977429 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977626 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977677 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977905 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.977968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978209 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978510 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978738 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978236 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979085 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978741 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979178 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979252 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod 
\"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979384 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979418 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979436 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979479 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979525 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979544 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978283 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979615 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978383 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979641 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978434 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979695 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979718 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978844 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979791 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978931 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.979865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.978943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980414 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980722 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980807 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980838 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980868 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.980958 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.981072 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.985686 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986032 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986611 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.986804 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987087 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987153 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.987842 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988262 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988467 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988695 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988796 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.988988 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989261 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989481 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989592 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989751 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.989845 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.990034 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.993610 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 11:22:44.993748 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 11:22:44 crc kubenswrapper[4775]: I0127 
11:22:44.996431 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:44.999421 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:44.999761 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000071 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000519 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.000869 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.001202 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.001524 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.002658 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-97tsz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.002959 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.003502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004069 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004368 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004497 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004639 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004833 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.004895 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.005705 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.006192 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 
11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.011924 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.013187 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.013624 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.014406 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.015697 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.016278 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.016859 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.017047 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.017347 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.019864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.020109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.020546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021198 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021369 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021522 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.021553 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.015803 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.022358 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.022523 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.023805 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024217 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.026273 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.024541 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.027295 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.025084 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034027 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034440 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.034718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035045 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035361 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035892 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.035932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.036818 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.049558 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.050011 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.050375 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.051695 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.052307 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.052488 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.053876 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055033 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055108 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055322 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055707 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055841 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.055981 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056171 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056281 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056352 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056834 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.056900 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057241 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057289 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057067 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.057122 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057168 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057215 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.057230 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058144 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058314 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058650 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.058746 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.059190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060400 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060580 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.060995 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.065264 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.066737 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.066820 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.067357 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.068487 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.072339 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.072688 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.073900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.074717 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.075760 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.079527 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.080272 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.090580 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.091690 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.092586 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.092868 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095108 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095148 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095798 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095870 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 
11:22:45.095907 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.095977 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096007 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096033 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096063 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096127 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096274 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096304 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096359 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096392 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096423 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096524 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096556 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096561 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096626 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096655 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: 
\"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096732 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096766 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096855 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096900 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.096996 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097021 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097039 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097081 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097102 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097124 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.097440 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.098141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-service-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.098974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.099270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-node-pullsecrets\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3e034909-37ed-4437-a799-daf81cbe8241-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.100958 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0acb956-caf6-4999-bc3b-02c0195fe7ad-config\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.103640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-serving-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.104376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/67761d7d-66a6-4808-803a-bf68ae3186a6-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-images\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105730 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.105784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/0bdc0fe8-51ba-4939-9220-5f45a846f997-machine-approver-tls\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0bdc0fe8-51ba-4939-9220-5f45a846f997-auth-proxy-config\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106376 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-audit-dir\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.106697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-image-import-ca\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.107016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-etcd-client\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.108113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-config\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.108490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-encryption-config\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.111164 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.112009 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-serving-cert\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.115482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e034909-37ed-4437-a799-daf81cbe8241-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.118704 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.119244 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.120779 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.121762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.122418 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.126043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0acb956-caf6-4999-bc3b-02c0195fe7ad-serving-cert\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.127712 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.128000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.128335 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.129302 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.129601 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.130530 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.130575 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.131093 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.132147 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.133837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.136355 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.137616 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.138650 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.139584 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.140472 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.141359 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.141426 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.142686 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.143385 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.144331 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.145236 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.146191 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.147536 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.148023 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.148768 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.155276 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.157251 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:45 
crc kubenswrapper[4775]: I0127 11:22:45.159787 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.162897 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.166224 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.167976 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.171266 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.171878 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.174840 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.177050 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.178732 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.179221 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.180657 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.181894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.183301 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.184579 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.186869 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.187143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.188974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.190637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.191797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.193247 4775 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.194503 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.195620 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.196982 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197971 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.197995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198036 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198070 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198120 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198137 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198160 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198319 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod 
\"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198346 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198374 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198495 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198522 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198571 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.198586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198653 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.198815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199318 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-audit-policies\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199424 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199484 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" 
Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86325a44-a87c-4898-90ce-1d402f969d3a-audit-dir\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.199748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.200993 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-etcd-client\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-encryption-config\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.201629 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86325a44-a87c-4898-90ce-1d402f969d3a-serving-cert\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.202003 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.203292 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.204783 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-wn6qf"] Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.205429 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.205976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.207314 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.227216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.255498 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.260017 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.262512 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.265022 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.267390 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.287630 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300822 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod 
\"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.300970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301100 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301119 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301138 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.301562 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-service-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302043 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-config\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302125 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfeddd59-a473-4baa-83d8-4bba68575acb-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.302189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-ca\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.303933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-etcd-client\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.304157 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70e56eaf-e2b2-4431-988a-e39e37012771-serving-cert\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.305239 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfeddd59-a473-4baa-83d8-4bba68575acb-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.307363 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.327469 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.347608 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.367480 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.388185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.407499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.427652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.447760 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.467812 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.487754 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.507340 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.528055 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.547749 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.567252 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.587794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.593403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7329644-12a0-4c3e-8a2a-2c38a7b78369-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.608751 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.627643 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.647566 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 11:22:45 crc 
kubenswrapper[4775]: I0127 11:22:45.651439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7329644-12a0-4c3e-8a2a-2c38a7b78369-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.668047 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.687985 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.708604 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.728374 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.748113 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.767409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.788510 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.808366 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.827943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.848079 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.867916 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.894110 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.907537 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.928177 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.947903 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.967694 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 11:22:45 crc kubenswrapper[4775]: I0127 11:22:45.988479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.007790 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.027360 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.047757 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.066315 4775 request.go:700] Waited for 1.006672158s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-665b6dd947-q9whj Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.087640 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.107763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.128100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.149035 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.168386 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.187535 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.207831 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.214035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-cabundle\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.229392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.248500 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.267718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 11:22:46 crc 
kubenswrapper[4775]: I0127 11:22:46.279335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-signing-key\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.290530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.308096 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.328829 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.355343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.367979 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.387576 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.407474 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.428647 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.448022 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.468217 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.530255 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.548129 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.553618 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc9r5\" (UniqueName: \"kubernetes.io/projected/f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd-kube-api-access-wc9r5\") pod \"machine-api-operator-5694c8668f-sknjj\" (UID: \"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.569612 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.588269 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.608969 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.654799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"controller-manager-879f6c89f-pg564\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.663410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchqn\" (UniqueName: \"kubernetes.io/projected/d0acb956-caf6-4999-bc3b-02c0195fe7ad-kube-api-access-jchqn\") pod \"authentication-operator-69f744f599-pr8gf\" (UID: \"d0acb956-caf6-4999-bc3b-02c0195fe7ad\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.705929 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlg6m\" (UniqueName: \"kubernetes.io/projected/3e034909-37ed-4437-a799-daf81cbe8241-kube-api-access-mlg6m\") pod \"openshift-config-operator-7777fb866f-qcw27\" (UID: \"3e034909-37ed-4437-a799-daf81cbe8241\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.725024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m89cw\" (UniqueName: \"kubernetes.io/projected/0bdc0fe8-51ba-4939-9220-5f45a846f997-kube-api-access-m89cw\") pod \"machine-approver-56656f9798-n9gfl\" (UID: \"0bdc0fe8-51ba-4939-9220-5f45a846f997\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.726714 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jg66\" (UniqueName: \"kubernetes.io/projected/67761d7d-66a6-4808-803a-bf68ae3186a6-kube-api-access-6jg66\") pod \"cluster-samples-operator-665b6dd947-q9whj\" (UID: \"67761d7d-66a6-4808-803a-bf68ae3186a6\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.745126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgss\" (UniqueName: \"kubernetes.io/projected/c13ee778-6aa2-4c33-92f6-1bddaadc2f82-kube-api-access-lpgss\") pod \"apiserver-76f77b778f-zcbc6\" (UID: \"c13ee778-6aa2-4c33-92f6-1bddaadc2f82\") " pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.747920 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.768091 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.781582 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.787717 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.808655 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.827978 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.846824 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.848835 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.868385 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.868967 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 11:22:46 crc kubenswrapper[4775]: W0127 11:22:46.886236 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bdc0fe8_51ba_4939_9220_5f45a846f997.slice/crio-12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0 WatchSource:0}: Error finding container 12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0: Status 404 returned error can't find the container with id 12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0 Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.887432 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.892192 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.913169 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.928894 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.944054 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.949787 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.950196 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.967994 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.970343 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:46 crc kubenswrapper[4775]: I0127 11:22:46.988925 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.011828 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.048081 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66668\" (UniqueName: \"kubernetes.io/projected/9ad82a99-23f4-4f61-9fa9-535b29e11fc3-kube-api-access-66668\") pod \"downloads-7954f5f757-7bkr9\" (UID: \"9ad82a99-23f4-4f61-9fa9-535b29e11fc3\") " pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.066703 4775 request.go:700] Waited for 1.867298117s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/serviceaccounts/route-controller-manager-sa/token Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.071656 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjxvr\" (UniqueName: \"kubernetes.io/projected/86325a44-a87c-4898-90ce-1d402f969d3a-kube-api-access-sjxvr\") pod \"apiserver-7bbb656c7d-m7xvw\" (UID: \"86325a44-a87c-4898-90ce-1d402f969d3a\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.090707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"route-controller-manager-6576b87f9c-ssb2w\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.100842 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.108385 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.109934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8lkh\" (UniqueName: \"kubernetes.io/projected/e7329644-12a0-4c3e-8a2a-2c38a7b78369-kube-api-access-b8lkh\") pod \"kube-storage-version-migrator-operator-b67b599dd-675rv\" (UID: \"e7329644-12a0-4c3e-8a2a-2c38a7b78369\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.134580 4775 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.135747 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-sknjj"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.148272 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.153585 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-pr8gf"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.168933 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.188180 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.207693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.221371 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.234569 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1b6882d_984d_432b_b3df_101a6437371b.slice/crio-f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e WatchSource:0}: Error finding container f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e: Status 404 returned error can't find the container with id f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.247012 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.256112 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcw27"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.261853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt8vn\" (UniqueName: \"kubernetes.io/projected/70e56eaf-e2b2-4431-988a-e39e37012771-kube-api-access-pt8vn\") pod \"etcd-operator-b45778765-r4wxp\" (UID: \"70e56eaf-e2b2-4431-988a-e39e37012771\") " pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.262689 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.265898 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e034909_37ed_4437_a799_daf81cbe8241.slice/crio-6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc WatchSource:0}: Error finding container 6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc: Status 404 returned error can't find the container with id 6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.283878 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.283963 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.287228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvpl2\" (UniqueName: \"kubernetes.io/projected/dfeddd59-a473-4baa-83d8-4bba68575acb-kube-api-access-bvpl2\") pod \"cluster-image-registry-operator-dc59b4c8b-6qbxf\" (UID: \"dfeddd59-a473-4baa-83d8-4bba68575acb\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.304412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9wfl\" (UniqueName: \"kubernetes.io/projected/fe363a11-e8c8-4b4d-8401-25ba48ff00e0-kube-api-access-v9wfl\") pod \"service-ca-9c57cc56f-9m7rd\" (UID: \"fe363a11-e8c8-4b4d-8401-25ba48ff00e0\") " pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.322506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.334100 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338260 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" 
(UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.338677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339093 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6kdj\" (UniqueName: \"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: 
\"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339116 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339744 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339919 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339969 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.339990 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340060 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340077 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340112 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdpxd\" (UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340133 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340228 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340302 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340475 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340501 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod 
\"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340587 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340608 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340641 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.340679 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341880 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.341897 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342439 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: 
\"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342581 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342846 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.342938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343234 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod 
\"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.343435 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:47.843421865 +0000 UTC m=+146.985019642 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343480 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343606 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343624 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.343962 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344005 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344037 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344099 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344146 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.344759 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.391113 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.445988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446270 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446334 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446380 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446397 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446433 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdpxd\" (UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446559 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446635 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446774 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.446993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447048 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447094 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447129 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447148 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447200 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447236 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447283 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447307 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447327 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447355 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447421 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447509 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447533 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447623 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447793 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447884 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447909 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447954 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.447983 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448006 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448018 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448056 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448083 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qthbw\" (UniqueName: \"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448105 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448133 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgp8q\" (UniqueName: \"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448306 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448332 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448387 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.448602 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:47.948572543 +0000 UTC m=+147.090170420 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.448974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449008 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449054 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3655cf31-d392-485f-ba8c-13ccddbe46e1-config\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.450728 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452336 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-images\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.449217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452719 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kdj\" (UniqueName: \"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452802 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452818 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452868 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d90473f1-e47f-453c-bbe4-52e528e160de-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.452929 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453189 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453226 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453248 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453244 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453284 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-trusted-ca\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.453919 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05079674-b89f-4310-98f0-b39caf8f6189-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9987fd7-5b35-449c-b24a-a38afb77db17-service-ca-bundle\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454638 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.454707 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-config\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.455628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-serving-cert\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.455683 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.455808 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.456176 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2a042536-1621-4dae-8564-a3de61645643-trusted-ca\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.457222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ebadab77-f881-4ec4-937f-eef9a677edfe-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.457724 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/65491a7a-a22b-4993-aef2-42e752143efd-proxy-tls\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.458418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b1dcaba9-07f4-405b-97bf-4575b0edacc5-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.458913 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.459568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.460657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/65491a7a-a22b-4993-aef2-42e752143efd-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.461092 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-stats-auth\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.461885 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2a042536-1621-4dae-8564-a3de61645643-metrics-tls\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.462156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05079674-b89f-4310-98f0-b39caf8f6189-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.462755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3655cf31-d392-485f-ba8c-13ccddbe46e1-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466544 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-metrics-certs\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.466979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.468641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.469540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebadab77-f881-4ec4-937f-eef9a677edfe-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470364 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zcbc6"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.470787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.471054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-config\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.471229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d90473f1-e47f-453c-bbe4-52e528e160de-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.472438 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.472961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/91f00de9-b734-4644-9164-b4b6c990aeb3-metrics-tls\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b1dcaba9-07f4-405b-97bf-4575b0edacc5-proxy-tls\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473567 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a9987fd7-5b35-449c-b24a-a38afb77db17-default-certificate\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.473792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.474127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.474817 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.477918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/87a94d4a-7341-4e6c-8194-a2e6832dbb01-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.478671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.481397 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7bkr9"]
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.482391 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5rhh\" (UniqueName: \"kubernetes.io/projected/4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61-kube-api-access-w5rhh\") pod \"migrator-59844c95c7-z9rvc\" (UID: \"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"
Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.484425 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc13ee778_6aa2_4c33_92f6_1bddaadc2f82.slice/crio-a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30 WatchSource:0}: Error finding container a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30: Status 404 returned error can't find the container with id a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.507994 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/02e25ab4-d6d1-40f7-8c8c-3920620cfb98-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-bxr5c\" (UID: \"02e25ab4-d6d1-40f7-8c8c-3920620cfb98\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.508269 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"a91870d7acb2e479eee9bb28808ce3ec64c0a74b2d611d0b8ea0faf7bea28d30"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.509726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerStarted","Data":"6d82b495c125c06cefa89d85b491c1e520f93cd0cb9adfa7e560b7eb20d1f7fc"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.510610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"1034cc44d2c89563511470e31f3454cf45e16be2f2cd722989c87d82c6930bfd"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.511464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"12f97c8fda8b0cea361867806196057567967e376da54523dd191fca955d2cf0"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.513560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerStarted","Data":"f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.514684 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7bkr9" event={"ID":"9ad82a99-23f4-4f61-9fa9-535b29e11fc3","Type":"ContainerStarted","Data":"fad1c9b987afda1f535dc920d67215cf88eaea92a38410980f3a3c65e1d900df"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.517372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" event={"ID":"d0acb956-caf6-4999-bc3b-02c0195fe7ad","Type":"ContainerStarted","Data":"ec2b93bd5af82503b510050e9eebe4a74b0576f88a8cab3ce9b60a530649a349"}
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.525185 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"collect-profiles-29491875-pj2rv\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.554650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555304 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName:
\"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555360 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555394 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.555438 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.556357 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.056342062 +0000 UTC m=+147.197939829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.557323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp9bb\" (UniqueName: \"kubernetes.io/projected/87a94d4a-7341-4e6c-8194-a2e6832dbb01-kube-api-access-zp9bb\") pod \"control-plane-machine-set-operator-78cbb6b69f-gl7ql\" (UID: \"87a94d4a-7341-4e6c-8194-a2e6832dbb01\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.557430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-csi-data-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.561191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-apiservice-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564623 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564702 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qthbw\" (UniqueName: 
\"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564764 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgp8q\" (UniqueName: \"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.564985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565075 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565098 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565135 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565202 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565243 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565297 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: 
\"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565441 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565612 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.565982 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-registration-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.571742 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-mountpoint-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.572284 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/5cf81fd9-7041-48eb-acff-470663fc9987-tmpfs\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.573818 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-srv-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.574814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-config\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.575608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a3e92f-cb64-4857-8e1a-4da128f94f55-config-volume\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.576587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-cert\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.576824 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-plugins-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.577920 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-certs\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.577946 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-serving-cert\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.578118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-socket-dir\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.579129 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06a3e92f-cb64-4857-8e1a-4da128f94f55-metrics-tls\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.579382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c5daf300-a879-408f-a78a-c70b0e77f54c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pdkpw\" 
(UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.581705 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/27d889ae-fa92-40b8-800d-d61fb92d618d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.583965 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9632ab24-73c1-4940-a642-482850dc4fe4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.584814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-srv-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.586720 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"console-f9d7485db-hj8rf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.587382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/03843cd3-d8c8-4007-b9d5-c1d2254c1677-profile-collector-cert\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.588277 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5cf81fd9-7041-48eb-acff-470663fc9987-webhook-cert\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.588880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/72a39a16-e53a-42b6-a71f-35d74ef633b6-node-bootstrap-token\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.590526 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.596207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6dt8\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-kube-api-access-m6dt8\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.608358 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdfw\" (UniqueName: \"kubernetes.io/projected/ebadab77-f881-4ec4-937f-eef9a677edfe-kube-api-access-jsdfw\") pod \"openshift-apiserver-operator-796bbdcf4f-dnr7b\" (UID: \"ebadab77-f881-4ec4-937f-eef9a677edfe\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.644680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2a042536-1621-4dae-8564-a3de61645643-bound-sa-token\") pod \"ingress-operator-5b745b69d9-z2n8m\" (UID: \"2a042536-1621-4dae-8564-a3de61645643\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.648395 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"oauth-openshift-558db77b4-jl5cc\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.668210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.668461 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.168402327 +0000 UTC m=+147.310000104 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.668940 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.669665 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.169650161 +0000 UTC m=+147.311247938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.684760 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.684878 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"marketplace-operator-79b997595-krl46\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.691254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6glzt\" (UniqueName: \"kubernetes.io/projected/a9987fd7-5b35-449c-b24a-a38afb77db17-kube-api-access-6glzt\") pod \"router-default-5444994796-97tsz\" (UID: \"a9987fd7-5b35-449c-b24a-a38afb77db17\") " pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.695705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-r4wxp"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.701334 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.706346 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jt5p\" (UniqueName: \"kubernetes.io/projected/b1dcaba9-07f4-405b-97bf-4575b0edacc5-kube-api-access-8jt5p\") pod \"machine-config-controller-84d6567774-wdzzd\" (UID: \"b1dcaba9-07f4-405b-97bf-4575b0edacc5\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.706548 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.723152 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.729964 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.747153 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3655cf31-d392-485f-ba8c-13ccddbe46e1-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-tzdxl\" (UID: \"3655cf31-d392-485f-ba8c-13ccddbe46e1\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.758586 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.759973 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.764514 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.767679 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769598 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6zlg\" (UniqueName: \"kubernetes.io/projected/91f00de9-b734-4644-9164-b4b6c990aeb3-kube-api-access-j6zlg\") pod \"dns-operator-744455d44c-l7rtf\" (UID: \"91f00de9-b734-4644-9164-b4b6c990aeb3\") " pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.769761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.770076 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.270061189 +0000 UTC m=+147.411658966 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.794509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kdj\" (UniqueName: \"kubernetes.io/projected/65491a7a-a22b-4993-aef2-42e752143efd-kube-api-access-d6kdj\") pod \"machine-config-operator-74547568cd-qsnl5\" (UID: \"65491a7a-a22b-4993-aef2-42e752143efd\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.800164 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfeddd59_a473_4baa_83d8_4bba68575acb.slice/crio-befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e WatchSource:0}: Error finding container befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e: Status 404 returned error can't find the container with id befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.801038 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdpxd\" (UniqueName: \"kubernetes.io/projected/ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e-kube-api-access-gdpxd\") pod \"console-operator-58897d9998-p6jjk\" (UID: \"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e\") " pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:47 crc kubenswrapper[4775]: W0127 11:22:47.812079 4775 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86325a44_a87c_4898_90ce_1d402f969d3a.slice/crio-164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1 WatchSource:0}: Error finding container 164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1: Status 404 returned error can't find the container with id 164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1 Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.833574 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.838985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/05079674-b89f-4310-98f0-b39caf8f6189-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lnt7p\" (UID: \"05079674-b89f-4310-98f0-b39caf8f6189\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.863107 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvts\" (UniqueName: \"kubernetes.io/projected/d90473f1-e47f-453c-bbe4-52e528e160de-kube-api-access-qlvts\") pod \"openshift-controller-manager-operator-756b6f6bc6-8nkgc\" (UID: \"d90473f1-e47f-453c-bbe4-52e528e160de\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.874208 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.874735 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.374718974 +0000 UTC m=+147.516316751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.886242 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.896990 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.907717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98b6h\" (UniqueName: \"kubernetes.io/projected/9632ab24-73c1-4940-a642-482850dc4fe4-kube-api-access-98b6h\") pod \"package-server-manager-789f6589d5-zwtbz\" (UID: \"9632ab24-73c1-4940-a642-482850dc4fe4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.917634 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.923349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4j7\" (UniqueName: \"kubernetes.io/projected/06a3e92f-cb64-4857-8e1a-4da128f94f55-kube-api-access-jl4j7\") pod \"dns-default-dqrtf\" (UID: \"06a3e92f-cb64-4857-8e1a-4da128f94f55\") " pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.940114 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.941409 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.946949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgp8q\" (UniqueName: \"kubernetes.io/projected/c5daf300-a879-408f-a78a-c70b0e77f54c-kube-api-access-bgp8q\") pod \"olm-operator-6b444d44fb-pdkpw\" (UID: \"c5daf300-a879-408f-a78a-c70b0e77f54c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.961214 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.963344 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.965508 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-9m7rd"] Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.967492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghc2h\" (UniqueName: \"kubernetes.io/projected/03843cd3-d8c8-4007-b9d5-c1d2254c1677-kube-api-access-ghc2h\") pod \"catalog-operator-68c6474976-9s82p\" (UID: \"03843cd3-d8c8-4007-b9d5-c1d2254c1677\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.973673 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.974860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.975003 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.474981249 +0000 UTC m=+147.616579026 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.975093 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:47 crc kubenswrapper[4775]: E0127 11:22:47.975373 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.475363969 +0000 UTC m=+147.616961746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:47 crc kubenswrapper[4775]: I0127 11:22:47.993886 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvstm\" (UniqueName: \"kubernetes.io/projected/72a39a16-e53a-42b6-a71f-35d74ef633b6-kube-api-access-vvstm\") pod \"machine-config-server-wn6qf\" (UID: \"72a39a16-e53a-42b6-a71f-35d74ef633b6\") " pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.003171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mszc7\" (UniqueName: \"kubernetes.io/projected/5cf81fd9-7041-48eb-acff-470663fc9987-kube-api-access-mszc7\") pod \"packageserver-d55dfcdfc-c4826\" (UID: \"5cf81fd9-7041-48eb-acff-470663fc9987\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.018963 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.023614 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.026631 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qksnf\" (UniqueName: \"kubernetes.io/projected/8cdd4efa-6bf4-4d24-96bd-43f82afa62dd-kube-api-access-qksnf\") pod \"csi-hostpathplugin-w97mp\" (UID: \"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd\") " pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:48 crc kubenswrapper[4775]: W0127 11:22:48.032330 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7329644_12a0_4c3e_8a2a_2c38a7b78369.slice/crio-74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652 WatchSource:0}: Error finding container 74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652: Status 404 returned error can't find the container with id 74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.039055 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.047051 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.048220 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxl9x\" (UniqueName: \"kubernetes.io/projected/64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9-kube-api-access-rxl9x\") pod \"ingress-canary-cnwdf\" (UID: \"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9\") " pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.061084 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.077600 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.078072 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.57805299 +0000 UTC m=+147.719650777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.083388 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.084496 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.085522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qthbw\" (UniqueName: \"kubernetes.io/projected/dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8-kube-api-access-qthbw\") pod \"service-ca-operator-777779d784-b9z7n\" (UID: \"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.085590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57dm\" (UniqueName: \"kubernetes.io/projected/27d889ae-fa92-40b8-800d-d61fb92d618d-kube-api-access-z57dm\") pod \"multus-admission-controller-857f4d67dd-mks6w\" (UID: \"27d889ae-fa92-40b8-800d-d61fb92d618d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.090701 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.102137 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.111247 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.126787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.127546 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.140037 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cnwdf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.141712 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-wn6qf" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.167655 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.179335 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.179669 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.679656221 +0000 UTC m=+147.821253998 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.187584 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.280527 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.280718 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.780684327 +0000 UTC m=+147.922282104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.281302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.281697 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 11:22:48.781686394 +0000 UTC m=+147.923284171 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.313011 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.377284 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.382749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.383058 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.883044068 +0000 UTC m=+148.024641845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.419877 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.431296 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.484890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.485601 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:48.985577885 +0000 UTC m=+148.127175742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.526088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" event={"ID":"dfeddd59-a473-4baa-83d8-4bba68575acb","Type":"ContainerStarted","Data":"845e4a4325ac169f816929b0885225b08de69b9962a8ada57d9504a172605f09"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.526136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" event={"ID":"dfeddd59-a473-4baa-83d8-4bba68575acb","Type":"ContainerStarted","Data":"befc71d40c1756016d877f0b4807667f7991943d51ae262e4d12a4c542405a3e"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.534858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"3782a7d51010d42de22c939f82264212b57f408c651f71843c8276f0acdffd3c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.537571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" event={"ID":"fe363a11-e8c8-4b4d-8401-25ba48ff00e0","Type":"ContainerStarted","Data":"120ebc6129f5a64b60659c2552ba56b975cf13e595290591c82f5da98dc04421"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.540578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wn6qf" event={"ID":"72a39a16-e53a-42b6-a71f-35d74ef633b6","Type":"ContainerStarted","Data":"6905af895f525fa2042d1bc8d6f6ef1681b832cd7095f4375386cac0df17c32f"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.542818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerStarted","Data":"cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.547098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" event={"ID":"e7329644-12a0-4c3e-8a2a-2c38a7b78369","Type":"ContainerStarted","Data":"74bef40596d2132e165443b694178b9a1328d803ec94c3a5d66a8ca1cd41e652"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.554785 4775 generic.go:334] "Generic (PLEG): container finished" podID="c13ee778-6aa2-4c33-92f6-1bddaadc2f82" containerID="279d14e0b8e0b15a1f08b767f09272885f007922be07a491f5b709d306ea3fca" exitCode=0 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.554841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerDied","Data":"279d14e0b8e0b15a1f08b767f09272885f007922be07a491f5b709d306ea3fca"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.586375 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.586763 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.086738834 +0000 UTC m=+148.228336611 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.588846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.589984 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.089970682 +0000 UTC m=+148.231568459 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"7758ca23a9febf551c373233e9ee3829dbee6201a5ca3f810754a266c30d4eba"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612911 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"3758b6fddd1a9e2f318035e5c51070c362b3b79635cb65be625a704fb5189664"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.612927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" event={"ID":"67761d7d-66a6-4808-803a-bf68ae3186a6","Type":"ContainerStarted","Data":"c4a4c89a6f7e35ac292c985fdc168904c23b58849d7caa287017ed777ca9fda9"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.631152 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"0de22aa185f3482beb7638a8c0058e83860be198a7a5af533355dc0f09fc3ecd"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.634306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7bkr9" event={"ID":"9ad82a99-23f4-4f61-9fa9-535b29e11fc3","Type":"ContainerStarted","Data":"f2139e54b6395976f28e92322924d5b93701257940ce4bf65084cd4010f378eb"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.637248 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.639841 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.640890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" event={"ID":"d0acb956-caf6-4999-bc3b-02c0195fe7ad","Type":"ContainerStarted","Data":"f8bb8a4c324a2ca2550be169f4cf6bf25439f07fa1abbaadf558c3f0b91c501a"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.642950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerStarted","Data":"164243c46dbf6527fba9f04ce91bb37f7ec8592fbafb3a549555d49d2d3567f1"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.643108 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.644554 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"59bb84c8b770468d0ee4c1b8565d8c5122a47febd8f7583356c0367f836a0f6c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.651854 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.652184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerDied","Data":"c17441b28133dd588ecc74a62fd3bd351c1021ba05ce5950a1ab5f82501d69da"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.651869 4775 generic.go:334] "Generic (PLEG): container finished" podID="3e034909-37ed-4437-a799-daf81cbe8241" containerID="c17441b28133dd588ecc74a62fd3bd351c1021ba05ce5950a1ab5f82501d69da" exitCode=0 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.658697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerStarted","Data":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.659393 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.661520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" event={"ID":"70e56eaf-e2b2-4431-988a-e39e37012771","Type":"ContainerStarted","Data":"c9abe0adefd21162a37c2594665905d14a41d4a41b98c5a1c9b0c75971990171"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.666904 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerStarted","Data":"139296b53cfcbab11c8831abaf6a0db6d586bb1a2b9f552fe62be0a6c6fbf343"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.669804 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerStarted","Data":"152d04ae80ec3e4ea65562160c2d55c0e2688c495a74f3bc1b1fca916b3879fa"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.671794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" event={"ID":"02e25ab4-d6d1-40f7-8c8c-3920620cfb98","Type":"ContainerStarted","Data":"10603a5cdc83798ecd347cd871e885f724cc2068ba34682eb62531aed0e55e51"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.673250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerStarted","Data":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.673282 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" 
event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerStarted","Data":"ee61d306bb5f6310bfe18fb9eb63cdf67c00e9b26b5cdca100d7222a8e1ec7f1"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.674222 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.675859 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" event={"ID":"87a94d4a-7341-4e6c-8194-a2e6832dbb01","Type":"ContainerStarted","Data":"e3bd8cfde438c5fb6bd9140a78ceb2157d9f455e3f32509df55aff1acf96c84c"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679896 4775 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pg564 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679943 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.679901 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680199 4775 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ssb2w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680233 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.680237 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.687316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"5e5371f69716ce0574de0a216a303208b8a3e9005e5543c6bab7ada0112b3257"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.687357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" 
event={"ID":"f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd","Type":"ContainerStarted","Data":"010f1761f2e0c7ba0b9c516f1d430eaa6808e5460d718db4cf032c917790dee6"} Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.700085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.702359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.202332795 +0000 UTC m=+148.343930572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.725613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-97tsz" event={"ID":"a9987fd7-5b35-449c-b24a-a38afb77db17","Type":"ContainerStarted","Data":"c40ba767d61350abfb2743430a7155d4424a03f05620ffd40b0e7e8166716eee"} Jan 27 11:22:48 crc kubenswrapper[4775]: W0127 11:22:48.744038 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3655cf31_d392_485f_ba8c_13ccddbe46e1.slice/crio-cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089 WatchSource:0}: Error finding container cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089: Status 404 returned error can't find the container with id cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089 Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.800614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p"] Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.803309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.818563 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.306650351 +0000 UTC m=+148.448248118 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.904371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.904699 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.404670155 +0000 UTC m=+148.546267932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:48 crc kubenswrapper[4775]: I0127 11:22:48.904892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:48 crc kubenswrapper[4775]: E0127 11:22:48.905150 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.405138207 +0000 UTC m=+148.546735984 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.007084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.007490 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.507447077 +0000 UTC m=+148.649044854 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.109372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.109692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.609676966 +0000 UTC m=+148.751274733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.218629 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.219186 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.719171841 +0000 UTC m=+148.860769618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.321148 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.321700 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.821688048 +0000 UTC m=+148.963285825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.364103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.424890 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.425385 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:49.925340305 +0000 UTC m=+149.066938082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.529521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.529814 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.029803533 +0000 UTC m=+149.171401310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.634582 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.635015 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.135001172 +0000 UTC m=+149.276598939 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.635963 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-6qbxf" podStartSLOduration=124.635947168 podStartE2EDuration="2m4.635947168s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.634426777 +0000 UTC m=+148.776024554" watchObservedRunningTime="2026-01-27 11:22:49.635947168 +0000 UTC m=+148.777544945" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.638093 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" podStartSLOduration=124.638085616 podStartE2EDuration="2m4.638085616s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.61172803 +0000 UTC m=+148.753325807" watchObservedRunningTime="2026-01-27 11:22:49.638085616 +0000 UTC m=+148.779683393" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.650122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.669227 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-pr8gf" podStartSLOduration=124.669210472 podStartE2EDuration="2m4.669210472s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-01-27 11:22:49.664968507 +0000 UTC m=+148.806566284" watchObservedRunningTime="2026-01-27 11:22:49.669210472 +0000 UTC m=+148.810808249" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.676304 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5"] Jan 27 11:22:49 crc kubenswrapper[4775]: W0127 11:22:49.693468 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd90473f1_e47f_453c_bbe4_52e528e160de.slice/crio-ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450 WatchSource:0}: Error finding container ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450: Status 404 returned error can't find the container with id ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450 Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.698296 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.722917 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" podStartSLOduration=124.722903471 podStartE2EDuration="2m4.722903471s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.719586941 +0000 UTC m=+148.861184718" watchObservedRunningTime="2026-01-27 11:22:49.722903471 +0000 UTC m=+148.864501248" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.738678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.739212 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.239196054 +0000 UTC m=+149.380793831 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.763017 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7bkr9" podStartSLOduration=124.762997921 podStartE2EDuration="2m4.762997921s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.753319008 +0000 UTC m=+148.894916785" watchObservedRunningTime="2026-01-27 11:22:49.762997921 +0000 UTC m=+148.904595688" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.786862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-wn6qf" event={"ID":"72a39a16-e53a-42b6-a71f-35d74ef633b6","Type":"ContainerStarted","Data":"aab0402e77c95fd90d528bd770421994fbae156a077fcb6186ba11b2e8e81ce2"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.799734 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" event={"ID":"e7329644-12a0-4c3e-8a2a-2c38a7b78369","Type":"ContainerStarted","Data":"dd754d54437a8c9b48441eb6c302bf775e5bf98c9875ea10a2f0ea56c86c286e"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.809015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mks6w"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.832802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" event={"ID":"3e034909-37ed-4437-a799-daf81cbe8241","Type":"ContainerStarted","Data":"6019d33d0f2e16302611a14b4766602cebac6a9e68651e74db247d66645d97c6"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.833432 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.840210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.841277 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.341250728 +0000 UTC m=+149.482848505 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.846199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" event={"ID":"d90473f1-e47f-453c-bbe4-52e528e160de","Type":"ContainerStarted","Data":"ff0e762d6faa8aa13134b4ad431bdd12761f70636bd4590d362600a55c9d0450"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.847477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" event={"ID":"ebadab77-f881-4ec4-937f-eef9a677edfe","Type":"ContainerStarted","Data":"bd6b5f43dc9212334d387e31c23f8ad1116aba961301f1a1380ac1c649b990ea"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.847502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" event={"ID":"ebadab77-f881-4ec4-937f-eef9a677edfe","Type":"ContainerStarted","Data":"59934139f60739147d6e6ca63fcf9b71336cc295171c236b6134faabd271905e"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.854320 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-sknjj" podStartSLOduration=124.854306622 podStartE2EDuration="2m4.854306622s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.850109068 +0000 UTC m=+148.991706835" watchObservedRunningTime="2026-01-27 11:22:49.854306622 +0000 UTC m=+148.995904399" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.861800 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-p6jjk"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.882078 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-dqrtf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.917335 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-q9whj" podStartSLOduration=124.917311735 podStartE2EDuration="2m4.917311735s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:49.901220437 +0000 UTC m=+149.042818214" watchObservedRunningTime="2026-01-27 11:22:49.917311735 +0000 UTC m=+149.058909512" Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.923980 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.924033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" 
event={"ID":"fe363a11-e8c8-4b4d-8401-25ba48ff00e0","Type":"ContainerStarted","Data":"6dfeb0a8789166cf9f0107eecfc21a282656d21cc7daff75f87e4dfdfd2e2fa8"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.937576 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.944796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:49 crc kubenswrapper[4775]: E0127 11:22:49.946512 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.446495938 +0000 UTC m=+149.588093715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.947232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" event={"ID":"3655cf31-d392-485f-ba8c-13ccddbe46e1","Type":"ContainerStarted","Data":"cb62e4661fbadbdd808e2ae3f4d64e1a30f940055a981243a142d3d89147c089"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.956888 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-l7rtf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.975224 4775 generic.go:334] "Generic (PLEG): container finished" podID="86325a44-a87c-4898-90ce-1d402f969d3a" containerID="0c21d0379315e4e1a10c6b8d2a3707db8d5b99085db38635086825a4019e13aa" exitCode=0 Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.975306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerDied","Data":"0c21d0379315e4e1a10c6b8d2a3707db8d5b99085db38635086825a4019e13aa"} Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.993323 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cnwdf"] Jan 27 11:22:49 crc kubenswrapper[4775]: I0127 11:22:49.993370 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerStarted","Data":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.007034 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w97mp"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.032649 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" podStartSLOduration=125.032633218 podStartE2EDuration="2m5.032633218s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.009936292 +0000 UTC m=+149.151534069" watchObservedRunningTime="2026-01-27 11:22:50.032633218 +0000 UTC m=+149.174230995" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.033717 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.046997 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.048497 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.54848294 +0000 UTC m=+149.690080717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.052833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"d534a92e7558c224edc75b37f9c15862614bac308377ab18c1a232908abe8e2d"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.056251 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" podStartSLOduration=125.05623924 podStartE2EDuration="2m5.05623924s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.054334519 +0000 UTC m=+149.195932296" watchObservedRunningTime="2026-01-27 11:22:50.05623924 +0000 UTC m=+149.197837017" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.059200 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n"] Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.076125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" event={"ID":"0bdc0fe8-51ba-4939-9220-5f45a846f997","Type":"ContainerStarted","Data":"dc43decedabb77a620e6ce95a67fc6b5608fc67504d35af5da4dd14b39a965a4"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.100232 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-service-ca/service-ca-9c57cc56f-9m7rd" podStartSLOduration=125.100216226 podStartE2EDuration="2m5.100216226s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.099009283 +0000 UTC m=+149.240607060" watchObservedRunningTime="2026-01-27 11:22:50.100216226 +0000 UTC m=+149.241814003" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.120768 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"fd5af5b8d2852bff69629b6a7e084786e4de9ab86ef5dff05021843d235893c8"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.133223 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-675rv" podStartSLOduration=125.133209972 podStartE2EDuration="2m5.133209972s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.131953438 +0000 UTC m=+149.273551215" watchObservedRunningTime="2026-01-27 11:22:50.133209972 +0000 UTC m=+149.274807739" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerStarted","Data":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerStarted","Data":"71f62b9e07cf144d54a44160698a6e892c6a6b7a96fbedaace452d7e78d81f2c"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.144928 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.148019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.149196 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.649185717 +0000 UTC m=+149.790783494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.158546 4775 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jl5cc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.158589 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.165901 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-wn6qf" podStartSLOduration=5.165884419 podStartE2EDuration="5.165884419s" podCreationTimestamp="2026-01-27 11:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.165594222 +0000 UTC m=+149.307191999" watchObservedRunningTime="2026-01-27 11:22:50.165884419 +0000 UTC m=+149.307482186" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.167718 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" event={"ID":"87a94d4a-7341-4e6c-8194-a2e6832dbb01","Type":"ContainerStarted","Data":"96c61ba0c44a9bdd2148a58561f77800040e6336869ae782384b29c41d5da1cd"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.218185 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"0533aa6edd918f14538f576e3deaf63ba4ac74bcea142e8bc767f023163401c0"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.220264 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dnr7b" podStartSLOduration=125.220253907 podStartE2EDuration="2m5.220253907s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.219821506 +0000 UTC m=+149.361419303" watchObservedRunningTime="2026-01-27 11:22:50.220253907 +0000 UTC m=+149.361851684" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.250678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.252364 
4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.752326639 +0000 UTC m=+149.893924416 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.252408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-97tsz" event={"ID":"a9987fd7-5b35-449c-b24a-a38afb77db17","Type":"ContainerStarted","Data":"24273afb2501ce305d2943100fd626f0dbe4e0d66743d1ac20b04568497aa216"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.271090 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"84c9086b4443674bc142c18247c231f240564432ae9e4c00298d1b63b6118922"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.310770 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-n9gfl" podStartSLOduration=126.310747127 podStartE2EDuration="2m6.310747127s" podCreationTimestamp="2026-01-27 11:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.293106977 +0000 UTC m=+149.434704754" watchObservedRunningTime="2026-01-27 11:22:50.310747127 +0000 UTC m=+149.452344904" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.333886 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" event={"ID":"05079674-b89f-4310-98f0-b39caf8f6189","Type":"ContainerStarted","Data":"2db66dd87ec8ce293ea30d0e9e985ac04c1af6ff3ebd7a1d807aa3274f8678f4"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.343960 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hj8rf" podStartSLOduration=125.343938969 podStartE2EDuration="2m5.343938969s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.335416138 +0000 UTC m=+149.477013925" watchObservedRunningTime="2026-01-27 11:22:50.343938969 +0000 UTC m=+149.485536736" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.353052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.354247 4775 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.854232799 +0000 UTC m=+149.995830576 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.371069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" event={"ID":"70e56eaf-e2b2-4431-988a-e39e37012771","Type":"ContainerStarted","Data":"9d72bf496b407b7649878b26f66112aa99674c59d371c19c52804c420f8311c7"} Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.372514 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.372583 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.380992 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.390732 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" podStartSLOduration=125.3907159 podStartE2EDuration="2m5.3907159s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.387963905 +0000 UTC m=+149.529561692" watchObservedRunningTime="2026-01-27 11:22:50.3907159 +0000 UTC m=+149.532313677" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.401299 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.428891 4775 csr.go:261] certificate signing request csr-2zzsd is approved, waiting to be issued Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.448505 4775 csr.go:257] certificate signing request csr-2zzsd is issued Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.453918 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.455222 4775 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:50.955207013 +0000 UTC m=+150.096804790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.530779 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podStartSLOduration=125.530763246 podStartE2EDuration="2m5.530763246s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.461688549 +0000 UTC m=+149.603286326" watchObservedRunningTime="2026-01-27 11:22:50.530763246 +0000 UTC m=+149.672361013" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.558089 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.558396 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.058385397 +0000 UTC m=+150.199983174 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.574996 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-gl7ql" podStartSLOduration=125.574980078 podStartE2EDuration="2m5.574980078s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.532681268 +0000 UTC m=+149.674279045" watchObservedRunningTime="2026-01-27 11:22:50.574980078 +0000 UTC m=+149.716577855" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.646003 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-97tsz" podStartSLOduration=125.645986297 podStartE2EDuration="2m5.645986297s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.638077893 +0000 UTC m=+149.779675680" watchObservedRunningTime="2026-01-27 11:22:50.645986297 +0000 UTC m=+149.787584074" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.660187 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.660617 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.160598625 +0000 UTC m=+150.302196402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.707428 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-r4wxp" podStartSLOduration=125.707413237 podStartE2EDuration="2m5.707413237s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.706288836 +0000 UTC m=+149.847886633" watchObservedRunningTime="2026-01-27 11:22:50.707413237 +0000 UTC m=+149.849011014" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.743820 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" podStartSLOduration=125.743803245 podStartE2EDuration="2m5.743803245s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.743475167 +0000 UTC m=+149.885072944" watchObservedRunningTime="2026-01-27 11:22:50.743803245 +0000 UTC m=+149.885401022" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.761962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.762243 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.262231376 +0000 UTC m=+150.403829153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.819409 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" podStartSLOduration=125.81939292 podStartE2EDuration="2m5.81939292s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:50.817612262 +0000 UTC m=+149.959210029" watchObservedRunningTime="2026-01-27 11:22:50.81939292 +0000 UTC m=+149.960990697" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.864683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.865004 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.364986479 +0000 UTC m=+150.506584256 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.966922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:50 crc kubenswrapper[4775]: E0127 11:22:50.967226 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.467212927 +0000 UTC m=+150.608810704 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.977601 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.983580 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:50 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:50 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:50 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:50 crc kubenswrapper[4775]: I0127 11:22:50.983748 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.068806 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.070399 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.570381701 +0000 UTC m=+150.711979478 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.172197 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.172812 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.672784514 +0000 UTC m=+150.814382291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.273932 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.274207 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.774167079 +0000 UTC m=+150.915764856 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.274566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.275123 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.775104605 +0000 UTC m=+150.916702382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.383263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.383637 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.883618513 +0000 UTC m=+151.025216290 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.415705 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"528d08f059ce3d8b2e449d87760e8588f3bea2f0cd1eecf6bd9cb4d114b6ea60"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.415758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" event={"ID":"2a042536-1621-4dae-8564-a3de61645643","Type":"ContainerStarted","Data":"4b17f1bf2ac6e8c7c305aaea8d2a8d4fe1c6f78427701ff7bf826500091a6b39"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.439256 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" event={"ID":"c5daf300-a879-408f-a78a-c70b0e77f54c","Type":"ContainerStarted","Data":"de42c2e202f71ffbc786ebc358dd070b6c1a2e4b64da6a0f87d192361561cdad"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.439302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" event={"ID":"c5daf300-a879-408f-a78a-c70b0e77f54c","Type":"ContainerStarted","Data":"2f1a6cbdb6ba15deef2e45e6126623f63cdccd0d29901ee45dc4a65925f5f5aa"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.440071 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.451480 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 11:17:50 +0000 UTC, rotation deadline is 2026-12-17 12:26:54.440365864 +0000 UTC Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 
11:22:51.451581 4775 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7777h4m2.988788713s for next certificate rotation Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.480920 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-z2n8m" podStartSLOduration=126.480905888 podStartE2EDuration="2m6.480905888s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.478631016 +0000 UTC m=+150.620228793" watchObservedRunningTime="2026-01-27 11:22:51.480905888 +0000 UTC m=+150.622503665" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.485303 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.488629 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:51.988617557 +0000 UTC m=+151.130215334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.492914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"43b6f8605a4e7694ccbd8aecd38778b552b9690904788d0df5f7ed221d595dd8"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.525625 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" podStartSLOduration=126.525606553 podStartE2EDuration="2m6.525606553s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.517833421 +0000 UTC m=+150.659431198" watchObservedRunningTime="2026-01-27 11:22:51.525606553 +0000 UTC m=+150.667204330" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.531665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"a1f77a6fb9d108229dd2b40a0bd7b77e59d4a5aa01a11175087fc6c7550fd453"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.531701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"db5eb3f5c96a180043b72a5496cb4347932089ad2fcb0cd3cbe05aa5447d84c2"} Jan 27 11:22:51 
crc kubenswrapper[4775]: I0127 11:22:51.548890 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pdkpw" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"12374620bc7d271a79c62cb620cefbdc0f810ebc145076a7b886613bef5c69c7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581142 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"80b0a4e92569f9e147b3229daf7f0ffe4f2659861e0f4244c02bce07ec4ef696"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" event={"ID":"9632ab24-73c1-4940-a642-482850dc4fe4","Type":"ContainerStarted","Data":"5a28e03b68a57f45bf360daf44711f1e6ac6505fa52536f50c1eb227f666bf14"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.581723 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.586983 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.587305 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.087289559 +0000 UTC m=+151.228887336 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.643512 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-z9rvc" event={"ID":"4d4ac5c2-22c0-4b96-8f1d-2aea539a7e61","Type":"ContainerStarted","Data":"eee1a9656c2f41d432bbf844e4977c11a157f7c30e4e81da4d63ed75788879cc"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.655731 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" podStartSLOduration=126.655714058 podStartE2EDuration="2m6.655714058s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.653054336 +0000 UTC m=+150.794652113" watchObservedRunningTime="2026-01-27 11:22:51.655714058 +0000 UTC m=+150.797311835" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.658206 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cnwdf" event={"ID":"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9","Type":"ContainerStarted","Data":"9b30a015543d93b1a33b63a65f0a742a246f4327aeef0d453c6e13a62cff9288"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.658250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cnwdf" event={"ID":"64a8cc9f-e0e9-48b7-a3a9-77a0c80c16d9","Type":"ContainerStarted","Data":"4b07e67c0f4e51027b9d8114576e4d16b08e24ab63fe1e79121135e3421f5350"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.690503 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.691694 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.191681786 +0000 UTC m=+151.333279563 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.699970 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" event={"ID":"5cf81fd9-7041-48eb-acff-470663fc9987","Type":"ContainerStarted","Data":"8107c7b877227e25164b7d0d1a366a18d1fa4c97d2dc74d6468dd892a5d56cbb"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.700013 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" event={"ID":"5cf81fd9-7041-48eb-acff-470663fc9987","Type":"ContainerStarted","Data":"31ae6e7018fcf1e9a9c974ec56e22d7f27ef67f348440c9e0d446abf85ad35b7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.700732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.705306 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cnwdf" podStartSLOduration=7.705293656 podStartE2EDuration="7.705293656s" podCreationTimestamp="2026-01-27 11:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.704237667 +0000 UTC m=+150.845835444" watchObservedRunningTime="2026-01-27 11:22:51.705293656 +0000 UTC m=+150.846891423" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.706857 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" event={"ID":"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e","Type":"ContainerStarted","Data":"b209e173952f26f4dc314786df4e9b28d480467980940127fbe701e2e2c30f0a"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.706900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" event={"ID":"ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e","Type":"ContainerStarted","Data":"913c5c74a03be31c906de368eab588682face76ab8ed48f102ae5349ad389a37"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.707732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.713009 4775 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-c4826 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.713061 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" podUID="5cf81fd9-7041-48eb-acff-470663fc9987" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Jan 
27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.715415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"da7d86930bccd9656409d12cb3e639ea3e4064717d262f6e0b687c01a82d4d7c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.715440 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"c7dd8f22cd9ccccf950c2cb368f466450b2b0d0d5aaf13383aeb86f052bfbf97"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716232 4775 patch_prober.go:28] interesting pod/console-operator-58897d9998-p6jjk container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716259 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" podUID="ef4e507b-d5cd-4d0e-b3bd-a67dbea4788e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.26:8443/readyz\": dial tcp 10.217.0.26:8443: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" event={"ID":"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8","Type":"ContainerStarted","Data":"36136d3a0eb9a873346fbbde8eb88c862390f091561d2a899de16ee76cc45b6d"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.716587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" event={"ID":"dfc0d40e-e1a7-419a-b59b-0e2b67c26ee8","Type":"ContainerStarted","Data":"ad63fa9301fa0e7d96096d745998f27df39fc9132ccaa4783c77839cd75aa285"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.718277 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-wdzzd" event={"ID":"b1dcaba9-07f4-405b-97bf-4575b0edacc5","Type":"ContainerStarted","Data":"b51e6b91945a903237ff01650c0d16cfe3ac0054bc8d395893e71d66851fcdd6"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.769836 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lnt7p" event={"ID":"05079674-b89f-4310-98f0-b39caf8f6189","Type":"ContainerStarted","Data":"3c210595a983f9f135eadad00e5c5b4d9fcdde9142f1ddc7ca78135733655eeb"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.769870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerStarted","Data":"bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.776191 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" podStartSLOduration=126.776178803 podStartE2EDuration="2m6.776178803s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.764941347 +0000 UTC m=+150.906539124" watchObservedRunningTime="2026-01-27 11:22:51.776178803 +0000 UTC m=+150.917776580" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.780224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-tzdxl" event={"ID":"3655cf31-d392-485f-ba8c-13ccddbe46e1","Type":"ContainerStarted","Data":"20ed37cb270f3e63f0e3af77729d40f3065657e5d2390f476b3b2afaf3889e16"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.786770 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerStarted","Data":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.787752 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.792257 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.794354 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.294331255 +0000 UTC m=+151.435929032 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.806680 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-krl46 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.806738 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.812617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"b1c85aa7a3fdc901b21d61123d0c8de3b6683b00c3f63b3a77635318b27457da"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.812657 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"37ccad49aefbf211461d72e8ca18ffca80c3d6df8cfe9ff2c445094ed5935b89"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.822305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" event={"ID":"02e25ab4-d6d1-40f7-8c8c-3920620cfb98","Type":"ContainerStarted","Data":"a69523693d53de97ec0c0446bafc9645aa373df1a002ade301bcfdd030ff8051"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.847643 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" podStartSLOduration=126.847626104 podStartE2EDuration="2m6.847626104s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.845480315 +0000 UTC m=+150.987078092" watchObservedRunningTime="2026-01-27 11:22:51.847626104 +0000 UTC m=+150.989223881" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.853808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" event={"ID":"d90473f1-e47f-453c-bbe4-52e528e160de","Type":"ContainerStarted","Data":"015f40b86d1688cba57d7c70e75205c818e9ded18ee1fff0c16f1af8420d6cb7"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.878292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"3b819ad549d3512db136f84299cb077290603b03ab7d6936e0f18fdfd4c7a772"} Jan 27 11:22:51 crc 
kubenswrapper[4775]: I0127 11:22:51.878337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" event={"ID":"65491a7a-a22b-4993-aef2-42e752143efd","Type":"ContainerStarted","Data":"ddd30e319025170daeca1ab0eef175d79bb88f40b4e3e7788b9f7d3419069f86"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.894313 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.894620 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.39460691 +0000 UTC m=+151.536204687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.898415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"529e6bd8040e6bdb7cbb3acffbab7bc9591d02a0cc51b16f12950ee913d081ec"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.898464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" event={"ID":"c13ee778-6aa2-4c33-92f6-1bddaadc2f82","Type":"ContainerStarted","Data":"5be4fed82fb5e89e14a93619decf3059ba064286bddf2781ef54adefa1ec4520"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900176 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" event={"ID":"03843cd3-d8c8-4007-b9d5-c1d2254c1677","Type":"ContainerStarted","Data":"660bea006c33bdb516d36c9db325277b958f06b70d3e022444e3efefb0d887b9"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" event={"ID":"03843cd3-d8c8-4007-b9d5-c1d2254c1677","Type":"ContainerStarted","Data":"ec136e7681689ccafbb0ad9b7c9d01830a866f7e87a5ee742bd42f1dae6c13c1"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.900745 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.920028 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" podStartSLOduration=126.920013901 podStartE2EDuration="2m6.920013901s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:51.917867073 +0000 UTC m=+151.059464850" watchObservedRunningTime="2026-01-27 11:22:51.920013901 +0000 UTC m=+151.061611678" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.922134 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" event={"ID":"86325a44-a87c-4898-90ce-1d402f969d3a","Type":"ContainerStarted","Data":"23c3b08f08e63d195f100b793de22fc9e44e05a4e327d2b6a4565f96a1e3d31c"} Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.924930 4775 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9s82p container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.924970 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" podUID="03843cd3-d8c8-4007-b9d5-c1d2254c1677" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.971043 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.972200 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" Jan 27 11:22:51 crc kubenswrapper[4775]: I0127 11:22:51.994951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:51 crc kubenswrapper[4775]: E0127 11:22:51.996824 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.496807618 +0000 UTC m=+151.638405395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:51.999805 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:52 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.016026 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.106217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.107327 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.607313701 +0000 UTC m=+151.748911478 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.136805 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-b9z7n" podStartSLOduration=127.136788972 podStartE2EDuration="2m7.136788972s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.0675069 +0000 UTC m=+151.209104677" watchObservedRunningTime="2026-01-27 11:22:52.136788972 +0000 UTC m=+151.278386749" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.184529 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" podStartSLOduration=127.184512369 podStartE2EDuration="2m7.184512369s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.137766829 +0000 UTC m=+151.279364606" watchObservedRunningTime="2026-01-27 11:22:52.184512369 +0000 UTC m=+151.326110146" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.186314 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8nkgc" podStartSLOduration=127.186306567 podStartE2EDuration="2m7.186306567s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.185673511 +0000 UTC m=+151.327271288" watchObservedRunningTime="2026-01-27 11:22:52.186306567 +0000 UTC m=+151.327904344" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.207866 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.208554 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.708539642 +0000 UTC m=+151.850137419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.283932 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.284707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.286844 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" podStartSLOduration=127.28683579 podStartE2EDuration="2m7.28683579s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.281869605 +0000 UTC m=+151.423467392" watchObservedRunningTime="2026-01-27 11:22:52.28683579 +0000 UTC m=+151.428433567" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.309263 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.309623 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.809610729 +0000 UTC m=+151.951208506 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.316679 4775 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-m7xvw container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.316827 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw" podUID="86325a44-a87c-4898-90ce-1d402f969d3a" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.21:8443/livez\": dial tcp 10.217.0.21:8443: connect: connection refused" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.324418 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-bxr5c" podStartSLOduration=127.324405081 podStartE2EDuration="2m7.324405081s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.32289814 +0000 UTC m=+151.464495917" watchObservedRunningTime="2026-01-27 11:22:52.324405081 +0000 UTC m=+151.466002858" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.402660 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.410818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.411325 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:52.911303882 +0000 UTC m=+152.052901659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.432223 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qsnl5" podStartSLOduration=127.43220419 podStartE2EDuration="2m7.43220419s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.403274264 +0000 UTC m=+151.544872041" watchObservedRunningTime="2026-01-27 11:22:52.43220419 +0000 UTC m=+151.573801967" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.472716 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" podStartSLOduration=127.472702231 podStartE2EDuration="2m7.472702231s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.433153397 +0000 UTC m=+151.574751184" watchObservedRunningTime="2026-01-27 11:22:52.472702231 +0000 UTC m=+151.614300008" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.512382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.512728 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.012717359 +0000 UTC m=+152.154315136 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.515946 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podStartSLOduration=127.515930026 podStartE2EDuration="2m7.515930026s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:52.513956212 +0000 UTC m=+151.655553999" watchObservedRunningTime="2026-01-27 11:22:52.515930026 +0000 UTC m=+151.657527803" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.614763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.614955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.615029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.615705 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.115679247 +0000 UTC m=+152.257277024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.616641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.646955 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.691686 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.716886 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.719860 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.219839908 +0000 UTC m=+152.361437685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.721911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.725422 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.818371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.818728 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.318683694 +0000 UTC m=+152.460281471 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.920037 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:52 crc kubenswrapper[4775]: E0127 11:22:52.920616 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.420590783 +0000 UTC m=+152.562188560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.978771 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.981263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-dqrtf" event={"ID":"06a3e92f-cb64-4857-8e1a-4da128f94f55","Type":"ContainerStarted","Data":"9705c2b3fefc46caeeb544d7b64cdba84f896dd91fcbe3088799eebeaf53357c"} Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.981589 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.995759 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:52 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:52 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.995805 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:52 crc kubenswrapper[4775]: I0127 11:22:52.999584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcw27" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.002286 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.021222 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.023553 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.52352934 +0000 UTC m=+152.665127117 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.027310 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-dqrtf" podStartSLOduration=9.027289853 podStartE2EDuration="9.027289853s" podCreationTimestamp="2026-01-27 11:22:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.02239687 +0000 UTC m=+152.163994647" watchObservedRunningTime="2026-01-27 11:22:53.027289853 +0000 UTC m=+152.168887630" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.054249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" event={"ID":"91f00de9-b734-4644-9164-b4b6c990aeb3","Type":"ContainerStarted","Data":"022a60d70c8f489f04235d83dda5d9802ffca3a71bdbb75b0a68c22e8fbbba57"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.123982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.124350 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.62433105 +0000 UTC m=+152.765928827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.144172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" event={"ID":"27d889ae-fa92-40b8-800d-d61fb92d618d","Type":"ContainerStarted","Data":"21123bb550dca958669ca6ce3d94621b6e5742881b4df027084b04904262a607"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.175788 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-l7rtf" podStartSLOduration=128.175752958 podStartE2EDuration="2m8.175752958s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.095974049 +0000 UTC m=+152.237571826" watchObservedRunningTime="2026-01-27 11:22:53.175752958 +0000 UTC m=+152.317350735" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.184755 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"06729a94d59fe2bf7b82240ae98f53ecb463942d56a2cbc8655b6705ea1ae5f2"} Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.193915 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mks6w" podStartSLOduration=128.193900641 podStartE2EDuration="2m8.193900641s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:53.176594291 +0000 UTC m=+152.318192078" watchObservedRunningTime="2026-01-27 11:22:53.193900641 +0000 UTC m=+152.335498428" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.205231 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9s82p" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.226550 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.229904 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.729880779 +0000 UTC m=+152.871478546 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.237977 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.238298 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-p6jjk" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.348750 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.349073 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.849042817 +0000 UTC m=+152.990640584 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.450392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.451331 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:53.951308016 +0000 UTC m=+153.092905793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.552038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.552359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.052346812 +0000 UTC m=+153.193944589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.655323 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.655605 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.155578048 +0000 UTC m=+153.297175825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: W0127 11:22:53.699525 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008 WatchSource:0}: Error finding container cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008: Status 404 returned error can't find the container with id cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008 Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.735171 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-c4826" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.762605 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.762912 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.262896564 +0000 UTC m=+153.404494341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: W0127 11:22:53.819554 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70 WatchSource:0}: Error finding container c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70: Status 404 returned error can't find the container with id c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70 Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.864728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.865085 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.36506909 +0000 UTC m=+153.506666867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.966184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:53 crc kubenswrapper[4775]: E0127 11:22:53.966493 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.466481607 +0000 UTC m=+153.608079384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.975555 4775 patch_prober.go:28] interesting pod/apiserver-76f77b778f-zcbc6 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]log ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]etcd ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/generic-apiserver-start-informers ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/max-in-flight-filter ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 27 11:22:53 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectcache ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-startinformers ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 27 11:22:53 crc kubenswrapper[4775]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 27 11:22:53 crc kubenswrapper[4775]: livez check failed Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.975628 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6" podUID="c13ee778-6aa2-4c33-92f6-1bddaadc2f82" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.985030 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:53 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:53 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:53 crc kubenswrapper[4775]: I0127 11:22:53.985077 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.067890 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.068279 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.568262653 +0000 UTC m=+153.709860430 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.169443 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.169764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.669748501 +0000 UTC m=+153.811346268 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.194197 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"1cd6906c2a7f11739cfe596d63f02ade73f04a959b79aab945acb154b74a4a6f"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.194321 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"2b37a4e5846ffcc455f9af22719def2e84bb9449d85c1bd8d80710adfa29facd"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.195331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1a915e56e5ae74e56c915dd2921b645a0311bb1185597fdf6c5424ab11c0820e"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.195439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"87314c930b0861c7348bfa1221b4b44c849ad00c8621536a26eaeab571b0aae8"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.196460 4775 generic.go:334] "Generic (PLEG): container finished" podID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerID="bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817" exitCode=0 Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.196535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerDied","Data":"bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.197662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c6f6919fe2ebe16c3cbcfaff7c1dbde15471ff43e9e94ebc6eed87bd1e425780"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.197700 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c7eb1de85289600e7dc67dd2c61b76ad9568b725556777489850f1e005130d70"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.200096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2fd34dd2b279752de359aee68435a5a849a7bcf3be93777a43c64852b07afe88"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.200151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cd9f9455686d54dc8c2a67a9032272c866352fbe2bdcacdab2bad6f70f4ec008"} Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.270436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.270556 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.77053373 +0000 UTC m=+153.912131497 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.270782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.272601 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.772585195 +0000 UTC m=+153.914182972 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.321768 4775 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.327129 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8snw"] Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.328024 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.329854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.339419 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375313 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.375432 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.87540814 +0000 UTC m=+154.017005917 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.375636 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.376210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.376254 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.876237973 +0000 UTC m=+154.017835750 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.477741 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.477986 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478081 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.478268 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:54.978238914 +0000 UTC m=+154.119836691 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478520 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.478599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.502716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"certified-operators-s8snw\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.523857 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.524926 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.526953 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.537151 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.579724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.580141 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.080123723 +0000 UTC m=+154.221721500 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.646305 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681074 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681349 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.681838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.681915 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.18189829 +0000 UTC m=+154.323496067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.682104 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.698636 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"community-operators-vkb7p\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.723667 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.724602 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.738798 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784552 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784623 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.784696 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.784971 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.28495967 +0000 UTC m=+154.426557447 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.840121 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888569 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.888764 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.388738341 +0000 UTC m=+154.530336118 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.888892 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: E0127 11:22:54.889476 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 11:22:55.38946717 +0000 UTC m=+154.531064947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-b7lls" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.889664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.889739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.908037 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"certified-operators-5kd8m\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") " pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.924908 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.926031 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.929314 4775 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T11:22:54.321796003Z","Handler":null,"Name":""}
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.931638 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.933708 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.940251 4775 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.940325 4775 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.983164 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 27 11:22:54 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Jan 27 11:22:54 crc kubenswrapper[4775]: [+]process-running ok
Jan 27 11:22:54 crc kubenswrapper[4775]: healthz check failed
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.983597 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.989702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.989981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.990016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.990044 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:54 crc kubenswrapper[4775]: I0127 11:22:54.996520 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.057773 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.094790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.095028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.103294 4775 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.103336 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.117244 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"community-operators-fchbb\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") " pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.134093 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-b7lls\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.161542 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.216414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" event={"ID":"8cdd4efa-6bf4-4d24-96bd-43f82afa62dd","Type":"ContainerStarted","Data":"59805f77353717ff12dee61f4aae7e868e50fe076e3edfb7635250eae886c87b"}
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.223054 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05" exitCode=0
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.224497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"}
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.224542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerStarted","Data":"1eec3f7497774ba660fe56e1601efacc89958991dbb3752466e04ed907d8b155"}
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.228840 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.247381 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.255022 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w97mp" podStartSLOduration=10.255001354000001 podStartE2EDuration="10.255001354s" podCreationTimestamp="2026-01-27 11:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:55.236659066 +0000 UTC m=+154.378256863" watchObservedRunningTime="2026-01-27 11:22:55.255001354 +0000 UTC m=+154.396599131"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.265413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.280608 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57a822f4_b93b_497d_bfc6_cf4f13cc8140.slice/crio-290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff WatchSource:0}: Error finding container 290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff: Status 404 returned error can't find the container with id 290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.319303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.425529 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.426162 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.428877 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.429377 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.438128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.511160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.511223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.589619 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"]
Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.611471 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbb40aba_c103_4a72_abd7_3e5b3aaa82e5.slice/crio-8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477 WatchSource:0}: Error finding container 8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477: Status 404 returned error can't find the container with id 8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.613424 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.614223 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.614712 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.616885 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.644109 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715023 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") "
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715118 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") "
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.715149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") pod \"04906ea0-5e8b-4e8b-8f20-c46587da8346\" (UID: \"04906ea0-5e8b-4e8b-8f20-c46587da8346\") "
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.716265 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume" (OuterVolumeSpecName: "config-volume") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.717324 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.718928 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j" (OuterVolumeSpecName: "kube-api-access-vjv2j") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "kube-api-access-vjv2j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.722781 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "04906ea0-5e8b-4e8b-8f20-c46587da8346" (UID: "04906ea0-5e8b-4e8b-8f20-c46587da8346"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.726150 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a37ccd_52c3_49cc_8db8_1f0069dee3c3.slice/crio-6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8 WatchSource:0}: Error finding container 6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8: Status 404 returned error can't find the container with id 6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.752344 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.759647 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817439 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjv2j\" (UniqueName: \"kubernetes.io/projected/04906ea0-5e8b-4e8b-8f20-c46587da8346-kube-api-access-vjv2j\") on node \"crc\" DevicePath \"\""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817663 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/04906ea0-5e8b-4e8b-8f20-c46587da8346-config-volume\") on node \"crc\" DevicePath \"\""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.817723 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/04906ea0-5e8b-4e8b-8f20-c46587da8346-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.978740 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 27 11:22:55 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Jan 27 11:22:55 crc kubenswrapper[4775]: [+]process-running ok
Jan 27 11:22:55 crc kubenswrapper[4775]: healthz check failed
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.979133 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 27 11:22:55 crc kubenswrapper[4775]: I0127 11:22:55.981113 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 27 11:22:55 crc kubenswrapper[4775]: W0127 11:22:55.984757 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod41348a87_6415_41fe_97a9_bcc552d7bc8e.slice/crio-53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336 WatchSource:0}: Error finding container 53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336: Status 404 returned error can't find the container with id 53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.232845 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv" event={"ID":"04906ea0-5e8b-4e8b-8f20-c46587da8346","Type":"ContainerDied","Data":"cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.233213 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf78fc6ef9d230c40aed4d7f6b98059ee501a89f8054d5de9225a945bf0f0a69"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.233291 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235150 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689" exitCode=0
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235200 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.235217 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerStarted","Data":"6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.238940 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerStarted","Data":"53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerStarted","Data":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243417 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerStarted","Data":"8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.243478 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.246801 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371" exitCode=0
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.246873 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.247421 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerStarted","Data":"290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254251 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1" exitCode=0
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.254529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"5e3718fa7769c29d58e7ea6f7af42eff70181f72f7af0705859deb32581a0268"}
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.295784 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" podStartSLOduration=131.295768258 podStartE2EDuration="2m11.295768258s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:22:56.292989753 +0000 UTC m=+155.434587530" watchObservedRunningTime="2026-01-27 11:22:56.295768258 +0000 UTC m=+155.437366035"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.523556 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:22:56 crc kubenswrapper[4775]: E0127 11:22:56.524077 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.524177 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.524383 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" containerName="collect-profiles"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.526134 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.527990 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.534472 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.636762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738295 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.738602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.739154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.739298 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.761648 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"redhat-marketplace-wpqn9\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.854489 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.922336 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.923415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.931325 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.975021 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.978752 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 27 11:22:56 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld
Jan 27 11:22:56 crc kubenswrapper[4775]: [+]process-running ok
Jan 27 11:22:56 crc kubenswrapper[4775]: healthz check failed
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.978798 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 27 11:22:56 crc kubenswrapper[4775]: I0127 11:22:56.979124 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zcbc6"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.047248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151659 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.151740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.152149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.152654 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.220334 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"redhat-marketplace-t4skp\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") " pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.254718 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263235 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263282 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263323 4775 patch_prober.go:28] interesting pod/downloads-7954f5f757-7bkr9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.263362 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7bkr9" podUID="9ad82a99-23f4-4f61-9fa9-535b29e11fc3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.271289 4775 generic.go:334] "Generic (PLEG): container finished" podID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerID="bc1e9cdd7d4a6a3a9ed8e7ee85f8174c8273fced2b790ad4cf14063463698b52" exitCode=0
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.272054 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerDied","Data":"bc1e9cdd7d4a6a3a9ed8e7ee85f8174c8273fced2b790ad4cf14063463698b52"}
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.287019 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.294669 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.309732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-m7xvw"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.533606 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.535117 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.540095 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.540888 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.559928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.559967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.560004 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.600928 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.600983 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hj8rf"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.615044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.622199 4775 patch_prober.go:28] interesting pod/console-f9d7485db-hj8rf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.622270 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hj8rf" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661390 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.661481 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.662972 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.663524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.680328 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"redhat-operators-v5q62\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.864408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.925815 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"]
Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.928048 4775 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.933884 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979800 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.979898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.990658 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:57 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:57 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:57 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:57 crc kubenswrapper[4775]: I0127 11:22:57.990727 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.081351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod 
\"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.082888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.083041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.107412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"redhat-operators-lw9xz\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") " pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.242622 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281868 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae" exitCode=0 Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281971 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.281998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerStarted","Data":"67afb1f7244fe2812d1d4acb97266d9ec82321b9084603e7c3e8b7b7b66acb18"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286172 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e" exitCode=0 Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286270 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.286299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerStarted","Data":"f7dc6e40e63c860fc724ef492981f5e211c90e6c7db158d9132d52f25b456767"} Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.381344 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.659200 4775 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.660121 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.662488 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.662552 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.664229 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.684376 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.692122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.692174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.698916 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793398 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") pod \"41348a87-6415-41fe-97a9-bcc552d7bc8e\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") pod \"41348a87-6415-41fe-97a9-bcc552d7bc8e\" (UID: \"41348a87-6415-41fe-97a9-bcc552d7bc8e\") " Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "41348a87-6415-41fe-97a9-bcc552d7bc8e" (UID: "41348a87-6415-41fe-97a9-bcc552d7bc8e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.793966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.795132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.795349 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/41348a87-6415-41fe-97a9-bcc552d7bc8e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.799412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "41348a87-6415-41fe-97a9-bcc552d7bc8e" (UID: "41348a87-6415-41fe-97a9-bcc552d7bc8e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.809405 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.897427 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/41348a87-6415-41fe-97a9-bcc552d7bc8e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.978890 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:22:58 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:22:58 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:22:58 crc kubenswrapper[4775]: healthz check failed Jan 27 11:22:58 crc kubenswrapper[4775]: I0127 11:22:58.978954 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.006876 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298222 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196" exitCode=0 Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298534 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.298559 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerStarted","Data":"aada0f1adaa2b58806b9e0dc31f109b054a31ac70cb0eb0272c44c192348a37d"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308572 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4" exitCode=0 Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.308688 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"03e04acda80c448d05ec4f5110391d0366fd0fc80d319a5ee3f1ee1c23fc4573"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317395 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"41348a87-6415-41fe-97a9-bcc552d7bc8e","Type":"ContainerDied","Data":"53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336"} Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317421 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53a0190a17c4a1879bdcfb99afd1b9c66eda4e02e269fba0213b15829a2f8336" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.317500 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 11:22:59 crc kubenswrapper[4775]: I0127 11:22:59.347321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.264141 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.264216 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.273740 4775 patch_prober.go:28] interesting pod/router-default-5444994796-97tsz container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 11:23:00 crc kubenswrapper[4775]: [-]has-synced failed: reason withheld Jan 27 11:23:00 crc kubenswrapper[4775]: [+]process-running ok Jan 27 11:23:00 crc kubenswrapper[4775]: healthz check failed Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.273817 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-97tsz" podUID="a9987fd7-5b35-449c-b24a-a38afb77db17" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.396514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerStarted","Data":"615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9"} Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.977006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:23:00 crc kubenswrapper[4775]: I0127 11:23:00.979396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-97tsz" Jan 27 11:23:01 crc kubenswrapper[4775]: I0127 11:23:01.411731 4775 generic.go:334] "Generic (PLEG): container finished" podID="8c496d7e-2613-433b-95bf-95257c8f2887" containerID="6eb47812cd5da34ef545cda10bc22d8bfb78abac828d9aedc2550c073bd33895" exitCode=0 Jan 27 11:23:01 crc kubenswrapper[4775]: I0127 11:23:01.412584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerDied","Data":"6eb47812cd5da34ef545cda10bc22d8bfb78abac828d9aedc2550c073bd33895"} Jan 27 11:23:02 crc kubenswrapper[4775]: I0127 11:23:02.980512 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 11:23:03 crc kubenswrapper[4775]: I0127 11:23:03.136123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-dqrtf" Jan 27 11:23:05 crc kubenswrapper[4775]: I0127 11:23:05.938519 
4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.082785 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") pod \"8c496d7e-2613-433b-95bf-95257c8f2887\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.082887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") pod \"8c496d7e-2613-433b-95bf-95257c8f2887\" (UID: \"8c496d7e-2613-433b-95bf-95257c8f2887\") " Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.083033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8c496d7e-2613-433b-95bf-95257c8f2887" (UID: "8c496d7e-2613-433b-95bf-95257c8f2887"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.083361 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c496d7e-2613-433b-95bf-95257c8f2887-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.101862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8c496d7e-2613-433b-95bf-95257c8f2887" (UID: "8c496d7e-2613-433b-95bf-95257c8f2887"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.184176 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c496d7e-2613-433b-95bf-95257c8f2887-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452340 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8c496d7e-2613-433b-95bf-95257c8f2887","Type":"ContainerDied","Data":"615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9"} Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452383 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="615bb5ca4122491fd3622f93c7b5426d8ae936e51bba95b8311c82f0837a87e9" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.452437 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 11:23:06 crc kubenswrapper[4775]: I0127 11:23:06.994956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.016570 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c945c8b1-655c-4522-b703-0c5b9b8fcf38-metrics-certs\") pod \"network-metrics-daemon-b48nk\" (UID: \"c945c8b1-655c-4522-b703-0c5b9b8fcf38\") " pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.157465 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-b48nk" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.276554 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7bkr9" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.722846 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:23:07 crc kubenswrapper[4775]: I0127 11:23:07.727309 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:23:15 crc kubenswrapper[4775]: I0127 11:23:15.167795 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.149482 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.150031 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7qsks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vkb7p_openshift-marketplace(f1ecb76d-1e7c-4889-ab6d-451e8b534308): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:19 crc kubenswrapper[4775]: E0127 11:23:19.151246 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" Jan 27 11:23:20 crc kubenswrapper[4775]: E0127 11:23:20.893228 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.903380 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.904832 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nmb78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wpqn9_openshift-marketplace(5415a9cc-8755-41e6-bd7b-1542339cadc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:21 crc kubenswrapper[4775]: E0127 11:23:21.906503 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.584586 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.615336 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.615590 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cpls5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lw9xz_openshift-marketplace(232e2caf-d6b3-47b9-9ca0-45aec1e95045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.616772 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lw9xz" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.624721 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.624872 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7htj4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-v5q62_openshift-marketplace(3ae6a7af-e7d7-440b-b7cb-366edba2d44e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 11:23:27 crc kubenswrapper[4775]: E0127 11:23:27.626060 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.013884 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-b48nk"] Jan 27 11:23:28 crc kubenswrapper[4775]: W0127 11:23:28.050359 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc945c8b1_655c_4522_b703_0c5b9b8fcf38.slice/crio-cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8 WatchSource:0}: Error finding container cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8: Status 404 returned error can't find the container with id cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.106957 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zwtbz" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"96a178553693bb50d5c7991838e3e488c6c839e4c6b764c6715f32a493532ed4"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"bf6180b3257485dc4fb651f383e57a343ac1477cfeaf469138f42716d9d8c7b4"} Jan 27 
11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.593903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-b48nk" event={"ID":"c945c8b1-655c-4522-b703-0c5b9b8fcf38","Type":"ContainerStarted","Data":"cddb190166a58244de497325c6afc2e0460f2751730c8cab60313a4b6e9f7ed8"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.597258 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.597360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.602940 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.602985 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.606904 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.606991 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"} Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.615357 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-b48nk" podStartSLOduration=163.615328212 podStartE2EDuration="2m43.615328212s" podCreationTimestamp="2026-01-27 11:20:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:28.609539115 +0000 UTC m=+187.751136912" watchObservedRunningTime="2026-01-27 11:23:28.615328212 +0000 UTC m=+187.756926009" Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.627873 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6" exitCode=0 Jan 27 11:23:28 crc kubenswrapper[4775]: I0127 11:23:28.628643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"} Jan 27 11:23:28 crc kubenswrapper[4775]: E0127 11:23:28.631108 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-v5q62" 
podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.517654 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.518301 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.638011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerStarted","Data":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.640870 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerStarted","Data":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.650826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerStarted","Data":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.653338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerStarted","Data":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"} Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.661705 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t4skp" podStartSLOduration=2.936087174 podStartE2EDuration="33.661688759s" podCreationTimestamp="2026-01-27 11:22:56 +0000 UTC" firstStartedPulling="2026-01-27 11:22:58.287281341 +0000 UTC m=+157.428879118" lastFinishedPulling="2026-01-27 11:23:29.012882936 +0000 UTC m=+188.154480703" observedRunningTime="2026-01-27 11:23:29.661090503 +0000 UTC m=+188.802688280" watchObservedRunningTime="2026-01-27 11:23:29.661688759 +0000 UTC m=+188.803286536" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.682439 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5kd8m" podStartSLOduration=2.840422261 podStartE2EDuration="35.682400832s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.248931256 +0000 UTC m=+155.390529033" lastFinishedPulling="2026-01-27 11:23:29.090909787 +0000 UTC m=+188.232507604" observedRunningTime="2026-01-27 11:23:29.677572881 +0000 UTC m=+188.819170668" watchObservedRunningTime="2026-01-27 11:23:29.682400832 +0000 UTC m=+188.823998619" Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.703078 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8snw" 
podStartSLOduration=1.74425539 podStartE2EDuration="35.703056553s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:55.228579156 +0000 UTC m=+154.370176933" lastFinishedPulling="2026-01-27 11:23:29.187380279 +0000 UTC m=+188.328978096" observedRunningTime="2026-01-27 11:23:29.701142721 +0000 UTC m=+188.842740518" watchObservedRunningTime="2026-01-27 11:23:29.703056553 +0000 UTC m=+188.844654330"
Jan 27 11:23:29 crc kubenswrapper[4775]: I0127 11:23:29.720104 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fchbb" podStartSLOduration=2.875851394 podStartE2EDuration="35.720080917s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.238937484 +0000 UTC m=+155.380535261" lastFinishedPulling="2026-01-27 11:23:29.083166997 +0000 UTC m=+188.224764784" observedRunningTime="2026-01-27 11:23:29.718797832 +0000 UTC m=+188.860395629" watchObservedRunningTime="2026-01-27 11:23:29.720080917 +0000 UTC m=+188.861678714"
Jan 27 11:23:32 crc kubenswrapper[4775]: I0127 11:23:32.986441 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.646772 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.646826 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.796617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:23:34 crc kubenswrapper[4775]: I0127 11:23:34.846897 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8snw"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.059173 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.059218 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.096369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.247939 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.248023 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.295181 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.446983 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 27 11:23:35 crc kubenswrapper[4775]: E0127 11:23:35.447224 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447237 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: E0127 11:23:35.447259 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447265 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447397 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c496d7e-2613-433b-95bf-95257c8f2887" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447410 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41348a87-6415-41fe-97a9-bcc552d7bc8e" containerName="pruner"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.447792 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.450540 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.451430 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.466581 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.533410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.533548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635431 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635554 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.635641 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.663269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.759887 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.760448 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.775545 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"]
Jan 27 11:23:35 crc kubenswrapper[4775]: I0127 11:23:35.818557 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.319510 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.695732 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerStarted","Data":"80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48"}
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.696205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerStarted","Data":"8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91"}
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.699581 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"}
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.712654 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.71263967 podStartE2EDuration="1.71263967s" podCreationTimestamp="2026-01-27 11:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:36.71043325 +0000 UTC m=+195.852031027" watchObservedRunningTime="2026-01-27 11:23:36.71263967 +0000 UTC m=+195.854237447"
Jan 27 11:23:36 crc kubenswrapper[4775]: I0127 11:23:36.924875 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.255275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.255327 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.295006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.525598 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.708215 4775 generic.go:334] "Generic (PLEG): container finished" podID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerID="80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48" exitCode=0
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.708292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerDied","Data":"80d5ed4ac57818277b923d597464c0dbabfd266d76aea35480dbb132b9aabf48"}
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.711774 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045" exitCode=0
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.711882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"}
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.712632 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5kd8m" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server" containerID="cri-o://b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" gracePeriod=2
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.712965 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fchbb" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server" containerID="cri-o://88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" gracePeriod=2
Jan 27 11:23:37 crc kubenswrapper[4775]: I0127 11:23:37.763590 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.193430 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.265685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") pod \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.266311 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") pod \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.266425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") pod \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\" (UID: \"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.273417 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5" (OuterVolumeSpecName: "kube-api-access-6z4x5") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "kube-api-access-6z4x5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.278329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities" (OuterVolumeSpecName: "utilities") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.279407 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.327668 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" (UID: "e4a37ccd-52c3-49cc-8db8-1f0069dee3c3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367382 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367532 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367572 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") pod \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\" (UID: \"57a822f4-b93b-497d-bfc6-cf4f13cc8140\") "
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367844 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z4x5\" (UniqueName: \"kubernetes.io/projected/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-kube-api-access-6z4x5\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367888 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.367899 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.368418 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities" (OuterVolumeSpecName: "utilities") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.370596 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9" (OuterVolumeSpecName: "kube-api-access-xfzg9") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "kube-api-access-xfzg9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.414213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a822f4-b93b-497d-bfc6-cf4f13cc8140" (UID: "57a822f4-b93b-497d-bfc6-cf4f13cc8140"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468839 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfzg9\" (UniqueName: \"kubernetes.io/projected/57a822f4-b93b-497d-bfc6-cf4f13cc8140-kube-api-access-xfzg9\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468877 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.468887 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a822f4-b93b-497d-bfc6-cf4f13cc8140-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719104 4775 generic.go:334] "Generic (PLEG): container finished" podID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454" exitCode=0
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"}
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719190 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5kd8m"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719216 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5kd8m" event={"ID":"57a822f4-b93b-497d-bfc6-cf4f13cc8140","Type":"ContainerDied","Data":"290297c00f0444d9d550e5200aba7133d2e252d2cc5cddaaa7f26158bf4b0fff"}
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.719239 4775 scope.go:117] "RemoveContainer" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.721238 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerStarted","Data":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"}
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728113 4775 generic.go:334] "Generic (PLEG): container finished" podID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3" exitCode=0
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728152 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fchbb"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"}
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.728263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fchbb" event={"ID":"e4a37ccd-52c3-49cc-8db8-1f0069dee3c3","Type":"ContainerDied","Data":"6ed253dee07a34146473ff3556fd2212e703a1c328ef425adac33fe2a7fe4fa8"}
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.741461 4775 scope.go:117] "RemoveContainer" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.751642 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vkb7p" podStartSLOduration=2.817677543 podStartE2EDuration="44.751601492s" podCreationTimestamp="2026-01-27 11:22:54 +0000 UTC" firstStartedPulling="2026-01-27 11:22:56.256214444 +0000 UTC m=+155.397812211" lastFinishedPulling="2026-01-27 11:23:38.190138383 +0000 UTC m=+197.331736160" observedRunningTime="2026-01-27 11:23:38.748008942 +0000 UTC m=+197.889606719" watchObservedRunningTime="2026-01-27 11:23:38.751601492 +0000 UTC m=+197.893199269"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.766430 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.773359 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5kd8m"]
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.783739 4775 scope.go:117] "RemoveContainer" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.798991 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.805264 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fchbb"]
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826003 4775 scope.go:117] "RemoveContainer" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.826467 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": container with ID starting with b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454 not found: ID does not exist" containerID="b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826509 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454"} err="failed to get container status \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": rpc error: code = NotFound desc = could not find container \"b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454\": container with ID starting with b99bcf8ea162873255cdc33446bfaac108e8bbaa579341917a1183a0e446d454 not found: ID does not exist"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.826559 4775 scope.go:117] "RemoveContainer" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.826948 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": container with ID starting with a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575 not found: ID does not exist" containerID="a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827020 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575"} err="failed to get container status \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": rpc error: code = NotFound desc = could not find container \"a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575\": container with ID starting with a8900d6b75918ae62ae0e0ccedd498bc8454622f061f8d348a35ee522962c575 not found: ID does not exist"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827057 4775 scope.go:117] "RemoveContainer" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.827312 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": container with ID starting with e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371 not found: ID does not exist" containerID="e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827342 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371"} err="failed to get container status \"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": rpc error: code = NotFound desc = could not find container \"e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371\": container with ID starting with e13e4f3903b4dae4275e06acadd0eb064125b167cffabaa9a9aeb91907640371 not found: ID does not exist"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.827368 4775 scope.go:117] "RemoveContainer" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.844546 4775 scope.go:117] "RemoveContainer" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.874023 4775 scope.go:117] "RemoveContainer" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.901376 4775 scope.go:117] "RemoveContainer" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.902860 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": container with ID starting with 88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3 not found: ID does not exist" containerID="88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.902891 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3"} err="failed to get container status \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": rpc error: code = NotFound desc = could not find container \"88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3\": container with ID starting with 88b74aa16addb492f86f91196aa4402919e2eb597d0b605badcce269879381f3 not found: ID does not exist"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.902926 4775 scope.go:117] "RemoveContainer" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.907706 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": container with ID starting with ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b not found: ID does not exist" containerID="ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.907755 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b"} err="failed to get container status \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": rpc error: code = NotFound desc = could not find container \"ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b\": container with ID starting with ff021fa899b71abe43c95b5cff98edfd985b80d7185978945a9fa200d971756b not found: ID does not exist"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.907785 4775 scope.go:117] "RemoveContainer" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"
Jan 27 11:23:38 crc kubenswrapper[4775]: E0127 11:23:38.908263 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": container with ID starting with 13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689 not found: ID does not exist" containerID="13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"
Jan 27 11:23:38 crc kubenswrapper[4775]: I0127 11:23:38.908292 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689"} err="failed to get container status \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": rpc error: code = NotFound desc = could not find container \"13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689\": container with ID starting with 13bb9c95e7aa1cb8042200975c4142c633fbbfe7671cedeed1be8c4b863f7689 not found: ID does not exist"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.098496 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") pod \"71ab4c03-397b-4240-a3f7-c731b6b4331f\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") "
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") pod \"71ab4c03-397b-4240-a3f7-c731b6b4331f\" (UID: \"71ab4c03-397b-4240-a3f7-c731b6b4331f\") "
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.177987 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "71ab4c03-397b-4240-a3f7-c731b6b4331f" (UID: "71ab4c03-397b-4240-a3f7-c731b6b4331f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.181881 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "71ab4c03-397b-4240-a3f7-c731b6b4331f" (UID: "71ab4c03-397b-4240-a3f7-c731b6b4331f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.279425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/71ab4c03-397b-4240-a3f7-c731b6b4331f-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.279499 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/71ab4c03-397b-4240-a3f7-c731b6b4331f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735608 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"71ab4c03-397b-4240-a3f7-c731b6b4331f","Type":"ContainerDied","Data":"8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91"}
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735644 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bcadaabd27d0dc64ffd0d3ea07a0ea677bc056e052585f076381046957cde91"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.735693 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.757190 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" path="/var/lib/kubelet/pods/57a822f4-b93b-497d-bfc6-cf4f13cc8140/volumes"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.757922 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" path="/var/lib/kubelet/pods/e4a37ccd-52c3-49cc-8db8-1f0069dee3c3/volumes"
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.927490 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:23:39 crc kubenswrapper[4775]: I0127 11:23:39.927745 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t4skp" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server" containerID="cri-o://09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" gracePeriod=2
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.416203 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") "
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494218 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") "
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.494281 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") pod \"615aabb4-e21b-4941-ba5d-d6148cee87af\" (UID: \"615aabb4-e21b-4941-ba5d-d6148cee87af\") "
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.495130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities" (OuterVolumeSpecName: "utilities") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.498895 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd" (OuterVolumeSpecName: "kube-api-access-fqkzd") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "kube-api-access-fqkzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.519322 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "615aabb4-e21b-4941-ba5d-d6148cee87af" (UID: "615aabb4-e21b-4941-ba5d-d6148cee87af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595849 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595883 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/615aabb4-e21b-4941-ba5d-d6148cee87af-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.595893 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqkzd\" (UniqueName: \"kubernetes.io/projected/615aabb4-e21b-4941-ba5d-d6148cee87af-kube-api-access-fqkzd\") on node \"crc\" DevicePath \"\""
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743077 4775 generic.go:334] "Generic (PLEG): container finished" podID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8" exitCode=0
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"}
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743168 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t4skp"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t4skp" event={"ID":"615aabb4-e21b-4941-ba5d-d6148cee87af","Type":"ContainerDied","Data":"67afb1f7244fe2812d1d4acb97266d9ec82321b9084603e7c3e8b7b7b66acb18"}
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.743209 4775 scope.go:117] "RemoveContainer" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.756201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"}
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.773543 4775 scope.go:117] "RemoveContainer" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.784151 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.788048 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t4skp"]
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.796370 4775 scope.go:117] "RemoveContainer" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.814827 4775 scope.go:117] "RemoveContainer" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"
Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.815337 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": container with ID starting with 09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8 not found: ID does not exist" containerID="09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815365 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8"} err="failed to get container status \"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": rpc error: code = NotFound desc = could not find container \"09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8\": container with ID starting with 09b2d0a13955b447395194b9f913700d4b7241eb36847162cd110b80db9a80b8 not found: ID does not exist"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815388 4775 scope.go:117] "RemoveContainer" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"
Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.815886 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": container with ID starting with 2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796 not found: ID does not exist" containerID="2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815938 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796"} err="failed to get container status \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": rpc error: code = NotFound desc = could not find container \"2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796\": container with ID starting with 2889d4e558106acdb8e1a5703fa7ee8364f743533f4ac7ff8267c873345d1796 not found: ID does not exist"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.815976 4775 scope.go:117] "RemoveContainer" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"
Jan 27 11:23:40 crc kubenswrapper[4775]: E0127 11:23:40.816286 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": container with ID starting with 15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae not found: ID does not exist" containerID="15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"
Jan 27 11:23:40 crc kubenswrapper[4775]: I0127 11:23:40.816320 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae"} err="failed to get container status \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": rpc error: code = NotFound desc = could not find container \"15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae\": container with ID starting with 15c64747041c532d51659d238a856c7acc46ac0f7567e78ae41e0791b32fd5ae not found: ID does not exist"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.764140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" path="/var/lib/kubelet/pods/615aabb4-e21b-4941-ba5d-d6148cee87af/volumes"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.769826 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154" exitCode=0
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.769965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"}
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852301 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852842 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852877 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852895 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852902 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852910 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852916 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852924 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852930 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852939 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852947 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852956 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852962 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852968 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852974 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-utilities"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852985 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.852990 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.852996 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853002 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: E0127 11:23:41.853009 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853015 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="extract-content"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853124 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71ab4c03-397b-4240-a3f7-c731b6b4331f" containerName="pruner"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853137 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="57a822f4-b93b-497d-bfc6-cf4f13cc8140" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853151 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a37ccd-52c3-49cc-8db8-1f0069dee3c3" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853161 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="615aabb4-e21b-4941-ba5d-d6148cee87af" containerName="registry-server"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.853686 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.856205 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.856396 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.857991 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.912932 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.913025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:41 crc kubenswrapper[4775]: I0127 11:23:41.913169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015619 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015717 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.015990 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.016090 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.043031 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"installer-9-crc\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") " pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.178167 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.697583 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Jan 27 11:23:42 crc kubenswrapper[4775]: W0127 11:23:42.705362 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5c9821b5_66df_49d6_a096_1494e7cdda93.slice/crio-3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f WatchSource:0}: Error finding container 3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f: Status 404 returned error can't find the container with id 3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.798048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerStarted","Data":"3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f"}
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.802937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerStarted","Data":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"}
Jan 27 11:23:42 crc kubenswrapper[4775]: I0127 11:23:42.818501 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lw9xz" podStartSLOduration=3.930616983 podStartE2EDuration="45.81848292s" podCreationTimestamp="2026-01-27 11:22:57 +0000 UTC" firstStartedPulling="2026-01-27 11:23:00.39935255 +0000 UTC m=+159.540950337" lastFinishedPulling="2026-01-27 11:23:42.287218497 +0000 UTC m=+201.428816274" observedRunningTime="2026-01-27 11:23:42.815170989 +0000 UTC m=+201.956768766" watchObservedRunningTime="2026-01-27 11:23:42.81848292 +0000 UTC m=+201.960080697"
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.821913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerStarted","Data":"b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72"}
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.825170 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79" exitCode=0
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.825233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"}
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.827658 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042" exitCode=0
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.827725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"}
Jan 27 11:23:43 crc kubenswrapper[4775]: I0127 11:23:43.846156 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.846136003 podStartE2EDuration="2.846136003s" podCreationTimestamp="2026-01-27 11:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:23:43.84277095 +0000 UTC m=+202.984368737" watchObservedRunningTime="2026-01-27 11:23:43.846136003 +0000 UTC m=+202.987733780"
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.835017 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerStarted","Data":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"}
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.838247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerStarted","Data":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"}
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.841499 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.842280 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.860721 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v5q62" podStartSLOduration=2.876722269 podStartE2EDuration="47.860702886s" podCreationTimestamp="2026-01-27 11:22:57 +0000 UTC" firstStartedPulling="2026-01-27 11:22:59.309918423 +0000 UTC m=+158.451516200" lastFinishedPulling="2026-01-27 11:23:44.29389903 +0000 UTC m=+203.435496817" observedRunningTime="2026-01-27 11:23:44.857801377 +0000 UTC m=+203.999399164" watchObservedRunningTime="2026-01-27 11:23:44.860702886 +0000 UTC m=+204.002300663"
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.872957 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wpqn9" podStartSLOduration=2.7067073649999998 podStartE2EDuration="48.872934809s" podCreationTimestamp="2026-01-27 11:22:56 +0000 UTC" firstStartedPulling="2026-01-27 11:22:58.287563889 +0000 UTC m=+157.429161666" lastFinishedPulling="2026-01-27 11:23:44.453791323 +0000 UTC m=+203.595389110" observedRunningTime="2026-01-27 11:23:44.870716591 +0000 UTC m=+204.012314368" watchObservedRunningTime="2026-01-27 11:23:44.872934809 +0000 UTC m=+204.014532586"
Jan 27 11:23:44 crc kubenswrapper[4775]: I0127 11:23:44.895353 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:23:45 crc kubenswrapper[4775]: I0127 11:23:45.897897 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vkb7p"
Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.854680 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.855045 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:23:46 crc kubenswrapper[4775]: I0127 11:23:46.907576 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wpqn9"
Jan 27 11:23:47 crc kubenswrapper[4775]: I0127 11:23:47.865910 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:23:47 crc kubenswrapper[4775]: I0127 11:23:47.866238 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v5q62"
Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.243302 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.243348 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.304576 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.902814 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" probeResult="failure" output=<
Jan 27 11:23:48 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s
Jan 27 11:23:48 crc kubenswrapper[4775]: >
Jan 27 11:23:48 crc kubenswrapper[4775]: I0127 11:23:48.919484 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.329177 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"]
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.329919 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lw9xz" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" containerID="cri-o://3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" gracePeriod=2
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.866827 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884804 4775 generic.go:334] "Generic (PLEG): container finished" podID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7" exitCode=0
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"}
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884880 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lw9xz"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lw9xz" event={"ID":"232e2caf-d6b3-47b9-9ca0-45aec1e95045","Type":"ContainerDied","Data":"03e04acda80c448d05ec4f5110391d0366fd0fc80d319a5ee3f1ee1c23fc4573"}
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.884932 4775 scope.go:117] "RemoveContainer" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.911595 4775 scope.go:117] "RemoveContainer" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.933871 4775 scope.go:117] "RemoveContainer" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.949644 4775 scope.go:117] "RemoveContainer" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"
Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.955041 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": container with ID starting with 3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7 not found: ID does not exist" containerID="3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955088 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7"} err="failed to get container status \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": rpc error: code = NotFound desc = could not find container \"3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7\": container with ID starting with 3bd6e0a7e9b37ac4cf4fb9fd5d826afacbe5ada9e52e74aceffbebd128d889d7 not found: ID does not exist"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955122 4775 scope.go:117] "RemoveContainer" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"
Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.955752 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": container with ID starting with 45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154 not found: ID does not exist" containerID="45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955788 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154"} err="failed to get container status \"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": rpc error: code = NotFound desc = could not find container \"45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154\": container with ID starting with 45a5038b5e4d7616d0725ace56bb79bea0938cf67fb8433c8e617d912432b154 not found: ID does not exist"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.955807 4775 scope.go:117] "RemoveContainer" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"
Jan 27 11:23:51 crc kubenswrapper[4775]: E0127 11:23:51.956198 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": container with ID starting with 8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4 not found: ID does not exist" containerID="8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"
Jan 27 11:23:51 crc kubenswrapper[4775]: I0127 11:23:51.956227 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4"} err="failed to get container status \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": rpc error: code = NotFound desc = could not find container \"8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4\": container with ID starting with 8c5fa263fe713b9e22e6ee4ddbc864ca974dedc201dfd884f5471ce3dc4597f4 not found: ID does not exist"
Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") "
Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") "
Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.052164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") pod \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\" (UID: \"232e2caf-d6b3-47b9-9ca0-45aec1e95045\") "
Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.053897 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities" (OuterVolumeSpecName: "utilities") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "utilities".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.060252 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5" (OuterVolumeSpecName: "kube-api-access-cpls5") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "kube-api-access-cpls5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.154078 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.154134 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpls5\" (UniqueName: \"kubernetes.io/projected/232e2caf-d6b3-47b9-9ca0-45aec1e95045-kube-api-access-cpls5\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.216886 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "232e2caf-d6b3-47b9-9ca0-45aec1e95045" (UID: "232e2caf-d6b3-47b9-9ca0-45aec1e95045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.255346 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/232e2caf-d6b3-47b9-9ca0-45aec1e95045-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.522696 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:23:52 crc kubenswrapper[4775]: I0127 11:23:52.530040 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lw9xz"] Jan 27 11:23:53 crc kubenswrapper[4775]: I0127 11:23:53.755142 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" path="/var/lib/kubelet/pods/232e2caf-d6b3-47b9-9ca0-45aec1e95045/volumes" Jan 27 11:23:56 crc kubenswrapper[4775]: I0127 11:23:56.921250 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:23:57 crc kubenswrapper[4775]: I0127 11:23:57.904658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:57 crc kubenswrapper[4775]: I0127 11:23:57.964295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517744 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517809 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.517861 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.518444 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.518542 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4" gracePeriod=600 Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945065 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4" exitCode=0 Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945135 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4"} Jan 27 11:23:59 crc kubenswrapper[4775]: I0127 11:23:59.945496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"} Jan 27 11:24:00 crc kubenswrapper[4775]: I0127 11:24:00.850595 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" containerID="cri-o://505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" gracePeriod=15 Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.319902 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.354953 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"] Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355278 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-content" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355306 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-content" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355329 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-utilities" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355342 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="extract-utilities" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355387 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: E0127 11:24:01.355412 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355425 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355645 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="232e2caf-d6b3-47b9-9ca0-45aec1e95045" containerName="registry-server" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.355684 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerName="oauth-openshift" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.356256 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.379275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"] Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489274 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489673 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489765 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489847 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489882 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489912 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489940 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.489977 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490018 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490047 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") pod \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\" (UID: \"27ef9f09-90fd-490f-a8b6-912a84eb05c5\") " Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.490786 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491072 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491120 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491176 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491196 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491241 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491267 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491292 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491322 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491347 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491379 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491370 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491825 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.491860 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492037 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492062 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492076 4775 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492088 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.492100 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.496866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.497215 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.498663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.499567 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.501861 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf" (OuterVolumeSpecName: "kube-api-access-vbwzf") pod "27ef9f09-90fd-490f-a8b6-912a84eb05c5" (UID: "27ef9f09-90fd-490f-a8b6-912a84eb05c5"). InnerVolumeSpecName "kube-api-access-vbwzf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593152 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593210 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " 
pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593315 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593427 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593424 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-dir\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.593472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.594260 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.594385 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-service-ca\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.595163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.595311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-audit-policies\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596124 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596251 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596340 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596426 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596540 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596655 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbwzf\" (UniqueName: \"kubernetes.io/projected/27ef9f09-90fd-490f-a8b6-912a84eb05c5-kube-api-access-vbwzf\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596780 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.596898 4775 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/27ef9f09-90fd-490f-a8b6-912a84eb05c5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 11:24:01 crc kubenswrapper[4775]: 
I0127 11:24:01.596495 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599531 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-session\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599604 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-error\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599879 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.599995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.602413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-user-template-login\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.603403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.604728 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4c8218f8-a92b-41ec-bbc0-56ab92db9285-v4-0-config-system-router-certs\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.612535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vktzf\" (UniqueName: \"kubernetes.io/projected/4c8218f8-a92b-41ec-bbc0-56ab92db9285-kube-api-access-vktzf\") pod \"oauth-openshift-75566f9bd7-b29mm\" (UID: \"4c8218f8-a92b-41ec-bbc0-56ab92db9285\") " pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.680845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964867 4775 generic.go:334] "Generic (PLEG): container finished" podID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" exitCode=0 Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerDied","Data":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.964978 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc" event={"ID":"27ef9f09-90fd-490f-a8b6-912a84eb05c5","Type":"ContainerDied","Data":"71f62b9e07cf144d54a44160698a6e892c6a6b7a96fbedaace452d7e78d81f2c"} Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.965006 4775 scope.go:117] "RemoveContainer" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3" Jan 27 11:24:01 crc kubenswrapper[4775]: I0127 11:24:01.965290 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jl5cc"
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.001869 4775 scope.go:117] "RemoveContainer" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"
Jan 27 11:24:02 crc kubenswrapper[4775]: E0127 11:24:02.003443 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": container with ID starting with 505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3 not found: ID does not exist" containerID="505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.003562 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3"} err="failed to get container status \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": rpc error: code = NotFound desc = could not find container \"505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3\": container with ID starting with 505fd8577d5a8e479f6b3dacb41a0b732d083d946cf2ca146294cb397508aca3 not found: ID does not exist"
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.019642 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"]
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.023405 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jl5cc"]
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.151009 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75566f9bd7-b29mm"]
Jan 27 11:24:02 crc kubenswrapper[4775]: W0127 11:24:02.157002 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c8218f8_a92b_41ec_bbc0_56ab92db9285.slice/crio-1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073 WatchSource:0}: Error finding container 1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073: Status 404 returned error can't find the container with id 1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.980803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" event={"ID":"4c8218f8-a92b-41ec-bbc0-56ab92db9285","Type":"ContainerStarted","Data":"e992f4f62fbcfcda8d01ff36a4a42daec58c7a0934049b58e4704240fe47eb7b"}
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.980853 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" event={"ID":"4c8218f8-a92b-41ec-bbc0-56ab92db9285","Type":"ContainerStarted","Data":"1852f953666a58b7472321977a7aca9e148ae526623aa7022fa13afae03f8073"}
Jan 27 11:24:02 crc kubenswrapper[4775]: I0127 11:24:02.981108 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm"
Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.003222 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm" podStartSLOduration=28.003202375 podStartE2EDuration="28.003202375s" podCreationTimestamp="2026-01-27 11:23:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:24:03.002307157 +0000 UTC m=+222.143904954" watchObservedRunningTime="2026-01-27 11:24:03.003202375 +0000 UTC m=+222.144800162"
Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.019346 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75566f9bd7-b29mm"
Jan 27 11:24:03 crc kubenswrapper[4775]: I0127 11:24:03.753682 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27ef9f09-90fd-490f-a8b6-912a84eb05c5" path="/var/lib/kubelet/pods/27ef9f09-90fd-490f-a8b6-912a84eb05c5/volumes"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.637043 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.638586 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.638716 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639138 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" gracePeriod=15
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639175 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" gracePeriod=15
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639201 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" gracePeriod=15
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639173 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" gracePeriod=15
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639247 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" gracePeriod=15
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639420 4775 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639663 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639684 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639700 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639724 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639732 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639744 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639751 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639761 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639768 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639781 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639788 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Jan 27 11:24:20 crc kubenswrapper[4775]: E0127 11:24:20.639817 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639824 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639941 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639956 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639970 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639979 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.639990 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.640001 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.640012 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704082 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704632 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704655 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.704677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.742218 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805615 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805746 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.805793 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.806977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807168 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807206 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807228 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807774 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:20 crc kubenswrapper[4775]: I0127 11:24:20.807805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.032190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 11:24:21 crc kubenswrapper[4775]: E0127 11:24:21.059122 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e92c0f5c582b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,LastTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.101489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7f62667413fd238f0fa32d4f5ea8db5ada528df2641989942fb36daae3dce93c"}
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.103440 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerID="b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72" exitCode=0
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.103548 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerDied","Data":"b0b65d278ea26de83de7ace1b46cf4ec14821a9e08e9d49fba53055de852bf72"}
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.105035 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.108739 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.109673 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.110469 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.112061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.112979 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c" exitCode=0
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113016 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89" exitCode=0
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113029 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699" exitCode=0
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113037 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57" exitCode=2
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.113085 4775 scope.go:117] "RemoveContainer" containerID="96a845e2a7eb764213aadd5ec14ffc332d630525097be7886dd085c3c6ca5f34"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.760308 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.761114 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:21 crc kubenswrapper[4775]: I0127 11:24:21.761575 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.127655 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.132604 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d"}
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.133629 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.134172 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.384365 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.384997 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.385343 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.428974 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") "
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") "
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") pod \"5c9821b5-66df-49d6-a096-1494e7cdda93\" (UID: \"5c9821b5-66df-49d6-a096-1494e7cdda93\") "
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429266 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock" (OuterVolumeSpecName: "var-lock") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429513 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-var-lock\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.429535 4775 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c9821b5-66df-49d6-a096-1494e7cdda93-kubelet-dir\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.434253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5c9821b5-66df-49d6-a096-1494e7cdda93" (UID: "5c9821b5-66df-49d6-a096-1494e7cdda93"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:24:22 crc kubenswrapper[4775]: I0127 11:24:22.531477 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5c9821b5-66df-49d6-a096-1494e7cdda93-kube-api-access\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.104203 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.105005 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.105661 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.106127 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.106600 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.137931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5c9821b5-66df-49d6-a096-1494e7cdda93","Type":"ContainerDied","Data":"3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f"}
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.137954 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.137964 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e182f610e09fc96e35b67928b70b87e4088bb29390aa1755b8c17f64b88e80f"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.140280 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141284 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629" exitCode=0
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141474 4775 scope.go:117] "RemoveContainer" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.141865 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.150727 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.151139 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.151492 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.157299 4775 scope.go:117] "RemoveContainer" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.172265 4775 scope.go:117] "RemoveContainer" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.187104 4775 scope.go:117] "RemoveContainer" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.199779 4775 scope.go:117] "RemoveContainer" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.213701 4775 scope.go:117] "RemoveContainer" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.230371 4775 scope.go:117] "RemoveContainer" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.231478 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": container with ID starting with fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c not found: ID does not exist" containerID="fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231522 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c"} err="failed to get container status \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": rpc error: code = NotFound desc = could not find container \"fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c\": container with ID starting with fbd342acc8ef03378a5bd021da893307951b5ea605744361f6eab6b578c83d3c not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231552 4775 scope.go:117] "RemoveContainer" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.231872 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": container with ID starting with f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89 not found: ID does not exist" containerID="f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231937 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89"} err="failed to get container status \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": rpc error: code = NotFound desc = could not find container \"f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89\": container with ID starting with f755b5d518addef61fa1888fb8d91d5fa6aa3bffa57f5f038df7488a1d079d89 not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.231983 4775 scope.go:117] "RemoveContainer" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.232331 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": container with ID starting with 80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699 not found: ID does not exist" containerID="80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232361 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699"} err="failed to get container status \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": rpc error: code = NotFound desc = could not find container \"80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699\": container with ID starting with 80091517e42c5779f98efda77379e6a3239169bf473fb6459ff3781c9d7d0699 not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232380 4775 scope.go:117] "RemoveContainer" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.232771 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": container with ID starting with ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57 not found: ID does not exist" containerID="ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232818 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57"} err="failed to get container status \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": rpc error: code = NotFound desc = could not find container \"ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57\": container with ID starting with ee4c9e95d6adf790bb4eae3d6533589a789590abc6d02c06836454ce618ede57 not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.232850 4775 scope.go:117] "RemoveContainer" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.233360 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": container with ID starting with 169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629 not found: ID does not exist" containerID="169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233388 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629"} err="failed to get container status \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": rpc error: code = NotFound desc = could not find container \"169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629\": container with ID starting with 169bbaebefb600441feb45a1490c6e4cebf6d5b5bdb9a731a983fb6fbce4e629 not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233406 4775 scope.go:117] "RemoveContainer" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"
Jan 27 11:24:23 crc kubenswrapper[4775]: E0127 11:24:23.233718 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": container with ID starting with 55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685 not found: ID does not exist" containerID="55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.233765 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685"} err="failed to get container status \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": rpc error: code = NotFound desc = could not find container \"55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685\": container with ID starting with 55928f9d0181532bb9499b1b5f0bbeaa0170089d38218c21f323d8ceb514f685 not found: ID does not exist"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240333 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240358 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240563 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240612 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.240629 4775 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.342099 4775 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.455333 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.455818 4775 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.456165 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:23 crc kubenswrapper[4775]: I0127 11:24:23.752308 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.138763 4775 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.22:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188e92c0f5c582b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,LastTimestamp:2026-01-27 11:24:21.057921716 +0000 UTC m=+240.199519533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.163671 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.164338 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.164776 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.165431 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.166174 4775 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:29 crc kubenswrapper[4775]: I0127 11:24:29.166224 4775 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.166661 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="200ms"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.367345 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="400ms"
Jan 27 11:24:29 crc kubenswrapper[4775]: E0127 11:24:29.768803 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="800ms"
Jan 27 11:24:30 crc kubenswrapper[4775]: E0127 11:24:30.570480 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="1.6s"
Jan 27 11:24:31 crc kubenswrapper[4775]: I0127 11:24:31.749379 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:31 crc kubenswrapper[4775]: I0127 11:24:31.750828 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:32 crc kubenswrapper[4775]: E0127 11:24:32.173496 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="3.2s"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.217146 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.217966 4775 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058" exitCode=1
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.218011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058"}
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.218886 4775 scope.go:117] "RemoveContainer" containerID="e2fc7ccf0bd3de5afa6fc20c293437f4e55dfa5c745c19b1bb9d937133589058"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.219428 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.220772 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.222916 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: E0127 11:24:35.375261 4775 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.22:6443: connect: connection refused" interval="6.4s"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.744677 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.745637 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.746200 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.746813 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.776748 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.776795 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:35 crc kubenswrapper[4775]: E0127 11:24:35.777894 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:35 crc kubenswrapper[4775]: I0127 11:24:35.778684 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:35 crc kubenswrapper[4775]: W0127 11:24:35.806586 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f WatchSource:0}: Error finding container 58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f: Status 404 returned error can't find the container with id 58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.025734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225003 4775 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4329eb3a61399ce0eceab91ddd8193e207cb20c01a496f40e5dd919acf58610d" exitCode=0
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4329eb3a61399ce0eceab91ddd8193e207cb20c01a496f40e5dd919acf58610d"}
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"58c3d54301c33cc5876c77f2cf966c89f35e2471fc11def3283e8f703120667f"}
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225370 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225383 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.225785 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:36 crc kubenswrapper[4775]: E0127 11:24:36.225927 4775 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.22:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.226049 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.226298 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.229873 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.229934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ef5a81e518ca785a434d6b7a0dee3b7169508b0e02080e4d4bd936956d71c34d"}
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.230839 4775 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.231304 4775 status_manager.go:851] "Failed to get status for pod" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:36 crc kubenswrapper[4775]: I0127 11:24:36.231681 4775 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.22:6443: connect: connection refused"
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.242697 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d1d7fce1e44af4af008b17532db5642d39fc1175df717b330e89ccd272c0655"}
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f341f77111d8af986366f6a41d8857f5fed9bde322762eeb2c45dcdac876be8a"}
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d4be0d78c406c2a2f3155a654c43c68c680ad827ab6c7b59733fb13fb1d1d197"}
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.243102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df4fb11c5c8d9f1acf57252492a6144675a36c93322cbcf4ffc12aef2c80a277"}
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.753005 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:24:37 crc kubenswrapper[4775]: I0127 11:24:37.757662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.250328 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"247b34de559934811170768cc750edf0379f5cbb22b9a8a6f0721defd0aa3dc1"}
Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.251050 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.250669 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:38 crc kubenswrapper[4775]: I0127 11:24:38.251238 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.780122 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.780523 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:40 crc kubenswrapper[4775]: I0127 11:24:40.789398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:43 crc kubenswrapper[4775]: I0127 11:24:43.262861 4775 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285292 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.285609 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.288853 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Jan 27 11:24:44 crc kubenswrapper[4775]: I0127 11:24:44.290734 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e34a4318-042c-4b5f-8f23-f4d269294fe1"
Jan 27 11:24:45 crc kubenswrapper[4775]: I0127 11:24:45.289831 4775 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:45 crc kubenswrapper[4775]: I0127 11:24:45.289869 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f2ad5463-900c-4c6a-b8f8-4961abf97877"
Jan 27 11:24:46 crc kubenswrapper[4775]: I0127 11:24:46.029295 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Jan 27 11:24:51 crc kubenswrapper[4775]: I0127 11:24:51.777142 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e34a4318-042c-4b5f-8f23-f4d269294fe1"
Jan 27 11:24:52 crc kubenswrapper[4775]: I0127 11:24:52.186820 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.136144 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.283123 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 27 11:24:53 crc kubenswrapper[4775]: I0127 11:24:53.885555 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.005825 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.619805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.622116 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.704064 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.708364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.849668 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.859936 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.902923 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 27 11:24:54 crc kubenswrapper[4775]: I0127 11:24:54.933687 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.024552 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.099527 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.141689 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.143419 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.191434 4775 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-service-ca"/"signing-cabundle" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.681200 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.734307 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.787685 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 11:24:55 crc kubenswrapper[4775]: I0127 11:24:55.994895 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.074349 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.078947 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.280651 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.290905 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.320559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.386350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.388131 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.403505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.432753 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.479938 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.484934 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.505296 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.600184 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.684645 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.698109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 11:24:56 crc 
kubenswrapper[4775]: I0127 11:24:56.700599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.720833 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.830864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:24:56 crc kubenswrapper[4775]: I0127 11:24:56.976679 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.008425 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.110530 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.190211 4775 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.315719 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.363918 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.442708 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.523487 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.560310 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.627026 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.652870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.654388 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.661923 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.693046 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.699705 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.842692 4775 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"config" Jan 27 11:24:57 crc kubenswrapper[4775]: I0127 11:24:57.866652 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.148703 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.185139 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.253932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.291896 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.377813 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.380161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.417546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.449809 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.529371 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.591549 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.605198 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.606839 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.619809 4775 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.695604 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.744586 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.797700 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.881862 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 11:24:58 crc kubenswrapper[4775]: I0127 11:24:58.940533 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.007885 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.075895 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.143162 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.327834 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.408350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.418341 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.534488 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.586703 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.657694 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.659726 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.676210 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.760137 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.760138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.767926 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.770052 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.909329 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 11:24:59 crc kubenswrapper[4775]: I0127 11:24:59.931245 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.046032 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.047338 4775 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.090901 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.129941 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.144309 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.148487 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.151224 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.153369 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.165311 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.167234 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.310343 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.361332 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.366714 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.451559 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.469153 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.507528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.570109 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.574658 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 11:25:00 crc kubenswrapper[4775]: I0127 11:25:00.677292 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.798636 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.818136 4775 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:00.972909 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.274048 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.312902 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.342468 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.356546 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.452718 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.507540 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.553170 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.609233 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.669300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.689662 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.719914 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.755263 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.764641 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.784849 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.793679 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.852241 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.924302 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 11:25:01 crc kubenswrapper[4775]: I0127 11:25:01.942973 4775 reflector.go:368] Caches 
populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.184982 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.195259 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.210145 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.269295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.313419 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.402845 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.412068 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.532311 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.578943 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.605854 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.631262 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.695937 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.747063 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.786368 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.893157 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.943282 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 11:25:02 crc kubenswrapper[4775]: I0127 11:25:02.965389 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.051092 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.077889 
4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.136341 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.194498 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.197503 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.265753 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.381646 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.444909 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.479114 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.488909 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.490636 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.665604 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.678109 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.692313 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.716756 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.752641 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.782446 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.814905 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.859219 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.918993 4775 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 11:25:03 crc kubenswrapper[4775]: I0127 11:25:03.972264 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.020223 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.041915 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.047739 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.245679 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.353786 4775 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.361923 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.361892127 podStartE2EDuration="44.361892127s" podCreationTimestamp="2026-01-27 11:24:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:24:42.986922663 +0000 UTC m=+262.128520450" watchObservedRunningTime="2026-01-27 11:25:04.361892127 +0000 UTC m=+283.503489944" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.363841 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.363966 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.364017 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9","openshift-marketplace/community-operators-vkb7p","openshift-marketplace/redhat-operators-v5q62","openshift-marketplace/marketplace-operator-79b997595-krl46","openshift-marketplace/certified-operators-s8snw"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.364567 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s8snw" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" containerID="cri-o://a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365214 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wpqn9" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server" containerID="cri-o://9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365312 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" containerID="cri-o://0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c" 
gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365556 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" containerID="cri-o://2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.365714 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v5q62" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" containerID="cri-o://05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27" gracePeriod=30 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.371944 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.372534 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.372560 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.372774 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c9821b5-66df-49d6-a096-1494e7cdda93" containerName="installer" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.373708 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.375822 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.402319 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"] Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.455921 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=21.455902256 podStartE2EDuration="21.455902256s" podCreationTimestamp="2026-01-27 11:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:25:04.453950687 +0000 UTC m=+283.595548504" watchObservedRunningTime="2026-01-27 11:25:04.455902256 +0000 UTC m=+283.597500033" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.491901 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.491957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.492065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.509816 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68158dce_8840_47f8_8dac_37abc28edc74.slice/crio-0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5415a9cc_8755_41e6_bd7b_1542339cadc6.slice/crio-9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5415a9cc_8755_41e6_bd7b_1542339cadc6.slice/crio-conmon-9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.569766 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.579355 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.595067 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.595213 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.597274 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.597365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: 
I0127 11:25:04.606644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc92bcc5-aeca-4736-b861-e6f1540a15d1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.612233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgrf9\" (UniqueName: \"kubernetes.io/projected/fc92bcc5-aeca-4736-b861-e6f1540a15d1-kube-api-access-qgrf9\") pod \"marketplace-operator-79b997595-qxmcq\" (UID: \"fc92bcc5-aeca-4736-b861-e6f1540a15d1\") " pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.618357 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.647674 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.647971 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.648189 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.648242 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/certified-operators-s8snw" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.667187 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.710097 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.718296 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.748678 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.772791 4775 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.781858 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.792176 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.799864 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.841014 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.843921 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.845396 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.846013 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 11:25:04 crc kubenswrapper[4775]: E0127 11:25:04.846093 4775 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-vkb7p" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.851750 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.870334 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.882331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.897774 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.902508 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.911894 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:25:04 crc kubenswrapper[4775]: I0127 11:25:04.952930 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004304 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004334 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") pod \"5415a9cc-8755-41e6-bd7b-1542339cadc6\" (UID: \"5415a9cc-8755-41e6-bd7b-1542339cadc6\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.004424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: 
"68158dce-8840-47f8-8dac-37abc28edc74"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005342 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005402 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005549 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities" (OuterVolumeSpecName: "utilities") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006327 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities" (OuterVolumeSpecName: "utilities") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.005437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") pod \"68158dce-8840-47f8-8dac-37abc28edc74\" (UID: \"68158dce-8840-47f8-8dac-37abc28edc74\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006407 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.006431 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") pod \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\" (UID: \"f1ecb76d-1e7c-4889-ab6d-451e8b534308\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.008481 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78" (OuterVolumeSpecName: "kube-api-access-nmb78") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "kube-api-access-nmb78". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.009314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: "68158dce-8840-47f8-8dac-37abc28edc74"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.009782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd" (OuterVolumeSpecName: "kube-api-access-4q8jd") pod "68158dce-8840-47f8-8dac-37abc28edc74" (UID: "68158dce-8840-47f8-8dac-37abc28edc74"). InnerVolumeSpecName "kube-api-access-4q8jd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.010500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4" (OuterVolumeSpecName: "kube-api-access-7htj4") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "kube-api-access-7htj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.010582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks" (OuterVolumeSpecName: "kube-api-access-7qsks") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "kube-api-access-7qsks". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.014221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-962qq\" (UniqueName: \"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.017332 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq" (OuterVolumeSpecName: "kube-api-access-962qq") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "kube-api-access-962qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018477 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018529 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") pod \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\" (UID: \"3ae6a7af-e7d7-440b-b7cb-366edba2d44e\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") pod \"2b487540-88bb-496a-9aff-3f383cdc858b\" (UID: \"2b487540-88bb-496a-9aff-3f383cdc858b\") " Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018863 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7htj4\" (UniqueName: \"kubernetes.io/projected/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-kube-api-access-7htj4\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018881 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmb78\" (UniqueName: \"kubernetes.io/projected/5415a9cc-8755-41e6-bd7b-1542339cadc6-kube-api-access-nmb78\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018891 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018900 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018909 4775 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/68158dce-8840-47f8-8dac-37abc28edc74-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018919 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018927 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4q8jd\" (UniqueName: \"kubernetes.io/projected/68158dce-8840-47f8-8dac-37abc28edc74-kube-api-access-4q8jd\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018936 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7qsks\" (UniqueName: \"kubernetes.io/projected/f1ecb76d-1e7c-4889-ab6d-451e8b534308-kube-api-access-7qsks\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.018944 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-962qq\" (UniqueName: 
\"kubernetes.io/projected/2b487540-88bb-496a-9aff-3f383cdc858b-kube-api-access-962qq\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.020189 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities" (OuterVolumeSpecName: "utilities") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.021119 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities" (OuterVolumeSpecName: "utilities") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.029305 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5415a9cc-8755-41e6-bd7b-1542339cadc6" (UID: "5415a9cc-8755-41e6-bd7b-1542339cadc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.038129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.043870 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.049555 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.067542 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1ecb76d-1e7c-4889-ab6d-451e8b534308" (UID: "f1ecb76d-1e7c-4889-ab6d-451e8b534308"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.078245 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b487540-88bb-496a-9aff-3f383cdc858b" (UID: "2b487540-88bb-496a-9aff-3f383cdc858b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119858 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119898 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119914 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1ecb76d-1e7c-4889-ab6d-451e8b534308-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119929 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5415a9cc-8755-41e6-bd7b-1542339cadc6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.119941 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b487540-88bb-496a-9aff-3f383cdc858b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.130162 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.131739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ae6a7af-e7d7-440b-b7cb-366edba2d44e" (UID: "3ae6a7af-e7d7-440b-b7cb-366edba2d44e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.182130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.196710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qxmcq"] Jan 27 11:25:05 crc kubenswrapper[4775]: W0127 11:25:05.206929 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc92bcc5_aeca_4736_b861_e6f1540a15d1.slice/crio-f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578 WatchSource:0}: Error finding container f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578: Status 404 returned error can't find the container with id f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.220617 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ae6a7af-e7d7-440b-b7cb-366edba2d44e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.242803 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.302534 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.325572 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.344659 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.369198 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456280 4775 generic.go:334] "Generic (PLEG): container finished" podID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" exitCode=0 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456393 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wpqn9" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wpqn9" event={"ID":"5415a9cc-8755-41e6-bd7b-1542339cadc6","Type":"ContainerDied","Data":"f7dc6e40e63c860fc724ef492981f5e211c90e6c7db158d9132d52f25b456767"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.456557 4775 scope.go:117] "RemoveContainer" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460114 4775 generic.go:334] "Generic (PLEG): container finished" podID="68158dce-8840-47f8-8dac-37abc28edc74" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c" exitCode=0 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerDied","Data":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.460286 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" event={"ID":"68158dce-8840-47f8-8dac-37abc28edc74","Type":"ContainerDied","Data":"139296b53cfcbab11c8831abaf6a0db6d586bb1a2b9f552fe62be0a6c6fbf343"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.461666 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-krl46" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.463579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.463677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"f81231987c58c930d579cee292107a078099c24faa5841e85fa28f0998310578"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.464802 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.466677 4775 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qxmcq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" start-of-body= Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.466782 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.57:8080/healthz\": dial tcp 10.217.0.57:8080: connect: connection refused" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470672 4775 generic.go:334] "Generic (PLEG): container finished" podID="2b487540-88bb-496a-9aff-3f383cdc858b" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c" exitCode=0 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470826 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8snw" event={"ID":"2b487540-88bb-496a-9aff-3f383cdc858b","Type":"ContainerDied","Data":"1eec3f7497774ba660fe56e1601efacc89958991dbb3752466e04ed907d8b155"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.470953 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8snw" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.476889 4775 generic.go:334] "Generic (PLEG): container finished" podID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27" exitCode=0 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477128 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v5q62" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.477255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v5q62" event={"ID":"3ae6a7af-e7d7-440b-b7cb-366edba2d44e","Type":"ContainerDied","Data":"aada0f1adaa2b58806b9e0dc31f109b054a31ac70cb0eb0272c44c192348a37d"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480229 4775 generic.go:334] "Generic (PLEG): container finished" podID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" exitCode=0 Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vkb7p" event={"ID":"f1ecb76d-1e7c-4889-ab6d-451e8b534308","Type":"ContainerDied","Data":"5e3718fa7769c29d58e7ea6f7af42eff70181f72f7af0705859deb32581a0268"} Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.480553 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vkb7p" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.483462 4775 scope.go:117] "RemoveContainer" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.488065 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.500998 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podStartSLOduration=7.500977385 podStartE2EDuration="7.500977385s" podCreationTimestamp="2026-01-27 11:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:25:05.496902582 +0000 UTC m=+284.638500419" watchObservedRunningTime="2026-01-27 11:25:05.500977385 +0000 UTC m=+284.642575162" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.513619 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.553993 4775 scope.go:117] "RemoveContainer" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.566057 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.583725 4775 scope.go:117] "RemoveContainer" containerID="9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.584207 
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.584280 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53"} err="failed to get container status \"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53\": rpc error: code = NotFound desc = could not find container \"9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53\": container with ID starting with 9f8ba1179e6ef95003c1b40bdd158d9fb9e77ab7efa60c5ed05c01aa487e3d53 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585009 4775 scope.go:117] "RemoveContainer" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.585510 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": container with ID starting with 6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042 not found: ID does not exist" containerID="6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585551 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042"} err="failed to get container status \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": rpc error: code = NotFound desc = could not find container \"6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042\": container with ID starting with 6e0176dca3e29240e260a03350628f715110baa09e14bcc7b6cad5d1b39d9042 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585578 4775 scope.go:117] "RemoveContainer" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.585857 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": container with ID starting with b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e not found: ID does not exist" containerID="b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585883 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e"} err="failed to get container status \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": rpc error: code = NotFound desc = could not find container \"b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e\": container with ID starting with b25210745c622d3cba1aa80dd9b60fd46f56f2da1bd54301841277d4247fe57e not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.585905 4775 scope.go:117] "RemoveContainer" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.600141 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.601777 4775 scope.go:117] "RemoveContainer" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.602959 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": container with ID starting with 0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c not found: ID does not exist" containerID="0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.603013 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c"} err="failed to get container status \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": rpc error: code = NotFound desc = could not find container \"0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c\": container with ID starting with 0a2714786dba76a28403af30d3f21d3f41909fcf8e45407144247fd537c7d84c not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.603042 4775 scope.go:117] "RemoveContainer" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.608301 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v5q62"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.614420 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.620542 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-krl46"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.629030 4775 scope.go:117] "RemoveContainer" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.630227 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.638208 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vkb7p"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.646397 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.651490 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wpqn9"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.653917 4775 scope.go:117] "RemoveContainer" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.654890 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.657645 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.658369 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s8snw"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.669789 4775 scope.go:117] "RemoveContainer" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670287 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": container with ID starting with a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c not found: ID does not exist" containerID="a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670327 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c"} err="failed to get container status \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": rpc error: code = NotFound desc = could not find container \"a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c\": container with ID starting with a3ad444bb1f0b35d9b3eb55555d03fcbc9027ec235fa0e588ff51ee419d7829c not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670356 4775 scope.go:117] "RemoveContainer" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670651 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": container with ID starting with af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6 not found: ID does not exist" containerID="af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670677 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6"} err="failed to get container status \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": rpc error: code = NotFound desc = could not find container \"af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6\": container with ID starting with af8ce26692b81b0aff928a7ec1345f52fcdc2ba684467dc5146159365766b4d6 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670692 4775 scope.go:117] "RemoveContainer" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.670949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": container with ID starting with d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05 not found: ID does not exist" containerID="d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.670993 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05"} err="failed to get container status \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": rpc error: code = NotFound desc = could not find container \"d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05\": container with ID starting with d4975ca616de05474ffba94cb3623e6e38fb81a99c64dc4cf2c7eabb55d48e05 not found: ID does not exist"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.671010 4775 scope.go:117] "RemoveContainer" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.686305 4775 scope.go:117] "RemoveContainer" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.713919 4775 scope.go:117] "RemoveContainer" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.721041 4775 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.721276 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" gracePeriod=5
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.726804 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.752540 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" path="/var/lib/kubelet/pods/2b487540-88bb-496a-9aff-3f383cdc858b/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.753412 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" path="/var/lib/kubelet/pods/3ae6a7af-e7d7-440b-b7cb-366edba2d44e/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.754131 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" path="/var/lib/kubelet/pods/5415a9cc-8755-41e6-bd7b-1542339cadc6/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.755221 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68158dce-8840-47f8-8dac-37abc28edc74" path="/var/lib/kubelet/pods/68158dce-8840-47f8-8dac-37abc28edc74/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.755658 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" path="/var/lib/kubelet/pods/f1ecb76d-1e7c-4889-ab6d-451e8b534308/volumes"
Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.794406 4775 scope.go:117] "RemoveContainer" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.795369 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": container with ID starting with 05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27 not found: ID does not exist" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"
not exist" containerID="05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.795424 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27"} err="failed to get container status \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": rpc error: code = NotFound desc = could not find container \"05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27\": container with ID starting with 05aba5b7512aa1509a6a473646a0ccb64ecbac3d9a77ec96328f7ba0ac801d27 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.795486 4775 scope.go:117] "RemoveContainer" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.796044 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": container with ID starting with 5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79 not found: ID does not exist" containerID="5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796080 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79"} err="failed to get container status \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": rpc error: code = NotFound desc = could not find container \"5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79\": container with ID starting with 5e772025f63d78f4f859278d289d11865f30cd6b354636b4c816ec0a5b186d79 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796107 4775 scope.go:117] "RemoveContainer" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.796528 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": container with ID starting with d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196 not found: ID does not exist" containerID="d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796554 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196"} err="failed to get container status \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": rpc error: code = NotFound desc = could not find container \"d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196\": container with ID starting with d23df7fd0a704759edfa13620f8546f754ac27751736844f5b203674edb61196 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.796570 4775 scope.go:117] "RemoveContainer" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.811871 4775 scope.go:117] "RemoveContainer" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045" Jan 27 11:25:05 crc 
kubenswrapper[4775]: I0127 11:25:05.825731 4775 scope.go:117] "RemoveContainer" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838207 4775 scope.go:117] "RemoveContainer" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.838745 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": container with ID starting with 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 not found: ID does not exist" containerID="2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838826 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1"} err="failed to get container status \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": rpc error: code = NotFound desc = could not find container \"2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1\": container with ID starting with 2982a5700b9f54c2cd9c60b55564ee602e9a969c495b532cca4460a1ad627af1 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.838873 4775 scope.go:117] "RemoveContainer" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.839283 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": container with ID starting with 0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045 not found: ID does not exist" containerID="0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.839364 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045"} err="failed to get container status \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": rpc error: code = NotFound desc = could not find container \"0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045\": container with ID starting with 0bf63fe385b9478386150daf3ba8d678fad4f20378cf61e3ba37a17dff17f045 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.839420 4775 scope.go:117] "RemoveContainer" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1" Jan 27 11:25:05 crc kubenswrapper[4775]: E0127 11:25:05.840050 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": container with ID starting with 575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1 not found: ID does not exist" containerID="575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.840088 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1"} err="failed to get container status 
\"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": rpc error: code = NotFound desc = could not find container \"575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1\": container with ID starting with 575964528cbb07677ff3fe3f47bd41e4551cd3d51b70588ce82b0edd887d32b1 not found: ID does not exist" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.896912 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.924190 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 11:25:05 crc kubenswrapper[4775]: I0127 11:25:05.996548 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.116309 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.128930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.185165 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.233674 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.249952 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.286344 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.301658 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.362079 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.423059 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493608 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/0.log" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493682 4775 generic.go:334] "Generic (PLEG): container finished" podID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d" exitCode=1 Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.493846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerDied","Data":"6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d"} Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.497315 4775 scope.go:117] "RemoveContainer" 
containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.735070 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.809417 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.848483 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 11:25:06 crc kubenswrapper[4775]: I0127 11:25:06.879362 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.003227 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.150868 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.200971 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.272831 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.281948 4775 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.421123 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.455919 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.482924 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.528142 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.528950 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/0.log" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529022 4775 generic.go:334] "Generic (PLEG): container finished" podID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" exitCode=1 Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529060 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerDied","Data":"f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320"} Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.529106 4775 scope.go:117] "RemoveContainer" 
containerID="6e2758023dde46428309b13cec59d9e922f01c1d8293042139dc3e97e5fea02d" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.532997 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:07 crc kubenswrapper[4775]: E0127 11:25:07.534823 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.578111 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.661549 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.817976 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.819193 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.838794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.878155 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 11:25:07 crc kubenswrapper[4775]: I0127 11:25:07.958054 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.108961 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.119805 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.341675 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.427790 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.536143 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.536618 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:08 crc kubenswrapper[4775]: E0127 11:25:08.536852 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator 
pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.749843 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.792888 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 11:25:08 crc kubenswrapper[4775]: I0127 11:25:08.845095 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.397085 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.399798 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.667625 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 11:25:09 crc kubenswrapper[4775]: I0127 11:25:09.991959 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.838292 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.838588 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891392 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891485 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891520 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891538 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891526 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891631 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891834 4775 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891852 4775 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891866 4775 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.891876 4775 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.900804 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:25:10 crc kubenswrapper[4775]: I0127 11:25:10.993191 4775 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551894 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551950 4775 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" exitCode=137 Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.551997 4775 scope.go:117] "RemoveContainer" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.552016 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.568377 4775 scope.go:117] "RemoveContainer" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: E0127 11:25:11.568770 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": container with ID starting with eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d not found: ID does not exist" containerID="eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.568916 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d"} err="failed to get container status \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": rpc error: code = NotFound desc = could not find container \"eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d\": container with ID starting with eb2d6528703536915ab12dc6e9751f92030449aaa2d76b92f23f092cd8b68c6d not found: ID does not exist" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.749894 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.750121 4775 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.760421 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.760465 4775 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3f309b33-e54b-48eb-a407-d8ac97d77f99" Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.763419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 11:25:11 crc kubenswrapper[4775]: I0127 11:25:11.763475 4775 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3f309b33-e54b-48eb-a407-d8ac97d77f99" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.719199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.720841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:14 crc kubenswrapper[4775]: I0127 11:25:14.721505 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:14 crc kubenswrapper[4775]: E0127 11:25:14.721801 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator 
pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:15 crc kubenswrapper[4775]: I0127 11:25:15.575251 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:15 crc kubenswrapper[4775]: E0127 11:25:15.576096 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"marketplace-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=marketplace-operator pod=marketplace-operator-79b997595-qxmcq_openshift-marketplace(fc92bcc5-aeca-4736-b861-e6f1540a15d1)\"" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" podUID="fc92bcc5-aeca-4736-b861-e6f1540a15d1" Jan 27 11:25:21 crc kubenswrapper[4775]: I0127 11:25:21.485069 4775 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 11:25:24 crc kubenswrapper[4775]: I0127 11:25:24.171944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 11:25:25 crc kubenswrapper[4775]: I0127 11:25:25.094938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 11:25:27 crc kubenswrapper[4775]: I0127 11:25:27.096591 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 11:25:28 crc kubenswrapper[4775]: I0127 11:25:28.745313 4775 scope.go:117] "RemoveContainer" containerID="f48e4b306fc5a52cf134c24fce1fb413a34a70a9426596eeaf9515ffe83b3320" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.088154 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.646765 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.646815 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" event={"ID":"fc92bcc5-aeca-4736-b861-e6f1540a15d1","Type":"ContainerStarted","Data":"a34f4e43fcb3601e0ea180c4cd3adefda3908a7e136a44c587c94bd71f0e2c86"} Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.647291 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:29 crc kubenswrapper[4775]: I0127 11:25:29.651284 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qxmcq" Jan 27 11:25:33 crc kubenswrapper[4775]: I0127 11:25:33.936892 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 11:25:38 crc kubenswrapper[4775]: I0127 11:25:38.979359 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
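
The two "Error syncing pod" entries above are the kubelet's crash-loop throttle at work: marketplace-operator sits in CrashLoopBackOff with a 10s back-off, and the RemoveContainer/ContainerStarted pair at 11:25:28-29 is the restart going through once the delay expires. A minimal Go sketch of that schedule, assuming the usual defaults of a 10s base that doubles per consecutive failure with a 5m cap (the constants are assumptions of this sketch, not values read from this node):

    package main

    import (
        "fmt"
        "time"
    )

    // backOff returns the restart delay after n consecutive failures:
    // base * 2^n, capped. 10s/5m match the "back-off 10s" message above
    // and the common kubelet defaults, but treat them as assumptions here.
    func backOff(n int) time.Duration {
        const base = 10 * time.Second
        const maxDelay = 5 * time.Minute
        d := base
        for i := 0; i < n; i++ {
            d *= 2
            if d >= maxDelay {
                return maxDelay
            }
        }
        return d
    }

    func main() {
        for n := 0; n <= 5; n++ {
            fmt.Printf("failure %d -> wait %s\n", n+1, backOff(n))
        }
    }
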
Jan 27 11:25:59 crc kubenswrapper[4775]: I0127 11:25:59.518181 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:25:59 crc kubenswrapper[4775]: I0127 11:25:59.518808 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
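
The patch_prober/prober pair above records one failed HTTP liveness check: the GET to http://127.0.0.1:8798/health was refused, so the probe result is "failure". A self-contained sketch of such a check, assuming a 1s timeout and the usual HTTP-probe convention that any status from 200 to 399 passes:

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    // probe performs one HTTP GET liveness check. Any transport error
    // (e.g. "connect: connection refused", as in the log) or a status
    // outside 200-399 counts as a failure.
    func probe(url string) (ok bool, output string) {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return false, err.Error()
        }
        defer resp.Body.Close()
        return resp.StatusCode >= 200 && resp.StatusCode < 400, resp.Status
    }

    func main() {
        ok, out := probe("http://127.0.0.1:8798/health")
        fmt.Printf("probeResult=%v output=%q\n", ok, out)
    }
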
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353684 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353743 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljpnn\" (UniqueName: \"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353807 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.353876 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") pod \"e1b6882d-984d-432b-b3df-101a6437371b\" (UID: \"e1b6882d-984d-432b-b3df-101a6437371b\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.354662 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.354767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config" (OuterVolumeSpecName: "config") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.355079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.355134 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.359178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.359966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn" (OuterVolumeSpecName: "kube-api-access-ljpnn") pod "e1b6882d-984d-432b-b3df-101a6437371b" (UID: "e1b6882d-984d-432b-b3df-101a6437371b"). InnerVolumeSpecName "kube-api-access-ljpnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456261 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") pod \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\" (UID: \"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb\") " Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456642 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456655 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e1b6882d-984d-432b-b3df-101a6437371b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456664 4775 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e1b6882d-984d-432b-b3df-101a6437371b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.456674 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljpnn\" (UniqueName: 
\"kubernetes.io/projected/e1b6882d-984d-432b-b3df-101a6437371b-kube-api-access-ljpnn\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.457165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca" (OuterVolumeSpecName: "client-ca") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.457201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config" (OuterVolumeSpecName: "config") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.459485 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k" (OuterVolumeSpecName: "kube-api-access-gxf7k") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "kube-api-access-gxf7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.461236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" (UID: "9d2bf0be-df8b-4f40-a468-4d32ed97bbeb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557609 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557646 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxf7k\" (UniqueName: \"kubernetes.io/projected/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-kube-api-access-gxf7k\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557660 4775 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.557669 4775 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646273 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"] Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646484 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646497 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646507 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646512 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646521 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646527 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646537 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646542 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646549 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646554 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646564 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: 
I0127 11:26:02.646570 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646579 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646585 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646594 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646599 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646609 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646614 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646623 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646628 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646635 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646641 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-utilities" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646651 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646660 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646669 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646675 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646683 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646691 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646700 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-content" Jan 27 11:26:02 
crc kubenswrapper[4775]: I0127 11:26:02.646708 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="extract-content" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.646718 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646726 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646824 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1b6882d-984d-432b-b3df-101a6437371b" containerName="controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646839 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b487540-88bb-496a-9aff-3f383cdc858b" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646851 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1ecb76d-1e7c-4889-ab6d-451e8b534308" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646861 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ae6a7af-e7d7-440b-b7cb-366edba2d44e" containerName="registry-server" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646874 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646882 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerName="route-controller-manager" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646894 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68158dce-8840-47f8-8dac-37abc28edc74" containerName="marketplace-operator" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.646904 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5415a9cc-8755-41e6-bd7b-1542339cadc6" containerName="registry-server"
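
The cpu_manager/memory_manager burst above is the RemoveStaleState pass that runs as the replacement controller-manager pod is admitted: checkpointed per-container resource assignments belonging to pods that no longer exist (the old controller-manager and route-controller-manager, the startup-monitor, the finished catalog pods) are dropped. A toy version with illustrative types, not kubelet's actual state structs:

    package main

    import "fmt"

    type key struct{ podUID, container string }

    // removeStaleState drops every checkpointed assignment whose pod is
    // no longer active, mirroring the log entries above.
    func removeStaleState(assignments map[key]string, active map[string]bool) {
        for k := range assignments { // deleting while ranging over a map is safe in Go
            if !active[k.podUID] {
                fmt.Printf("RemoveStaleState: removing container pod=%s container=%s\n", k.podUID, k.container)
                delete(assignments, k)
            }
        }
    }

    func main() {
        state := map[key]string{
            {"e1b6882d", "controller-manager"}:       "cpuset 0-3",
            {"9d2bf0be", "route-controller-manager"}: "cpuset 0-3",
            {"7a1706f3", "controller-manager"}:       "cpuset 0-3",
        }
        removeStaleState(state, map[string]bool{"7a1706f3": true}) // only the new pod is active
        fmt.Println("entries kept:", len(state))
    }
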
Need to start a new one" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.660908 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"] Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759618 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.759674 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.760335 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.835964 4775 generic.go:334] "Generic (PLEG): container finished" podID="e1b6882d-984d-432b-b3df-101a6437371b" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" exitCode=0 Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836041 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerDied","Data":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"} Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" event={"ID":"e1b6882d-984d-432b-b3df-101a6437371b","Type":"ContainerDied","Data":"f1d7f91efbd16850b79ed6c4723629965776aad4a43a007e3ed55d3f13cef28e"} Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836092 4775 scope.go:117] "RemoveContainer" 
containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.836199 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pg564" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839242 4775 generic.go:334] "Generic (PLEG): container finished" podID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" exitCode=0 Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerDied","Data":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"} Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" event={"ID":"9d2bf0be-df8b-4f40-a468-4d32ed97bbeb","Type":"ContainerDied","Data":"ee61d306bb5f6310bfe18fb9eb63cdf67c00e9b26b5cdca100d7222a8e1ec7f1"} Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.839395 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.855903 4775 scope.go:117] "RemoveContainer" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.856379 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": container with ID starting with c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e not found: ID does not exist" containerID="c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.856433 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e"} err="failed to get container status \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": rpc error: code = NotFound desc = could not find container \"c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e\": container with ID starting with c37af5aed8eb30c46ec5ed8da24c974e66807b1c9ce269f93718b4571400d95e not found: ID does not exist" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.856479 4775 scope.go:117] "RemoveContainer" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.861980 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod 
\"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862140 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.862187 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.863668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-client-ca\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.863877 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-config\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.864934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a1706f3-485f-4023-aee7-43602de1dafe-proxy-ca-bundles\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.867695 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1706f3-485f-4023-aee7-43602de1dafe-serving-cert\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.872357 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.876148 4775 scope.go:117] "RemoveContainer" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" Jan 27 11:26:02 crc kubenswrapper[4775]: E0127 11:26:02.880890 4775 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": container with ID starting with fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6 not found: ID does not exist" containerID="fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.880984 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6"} err="failed to get container status \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": rpc error: code = NotFound desc = could not find container \"fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6\": container with ID starting with fab8da6f7893240c3fe303a664878c295dcbd474a336740ef33f90b0b9af8ac6 not found: ID does not exist" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.883226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9xz\" (UniqueName: \"kubernetes.io/projected/7a1706f3-485f-4023-aee7-43602de1dafe-kube-api-access-mq9xz\") pod \"controller-manager-7744f4db6d-kv9sd\" (UID: \"7a1706f3-485f-4023-aee7-43602de1dafe\") " pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.889591 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pg564"] Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.902461 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.907306 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ssb2w"] Jan 27 11:26:02 crc kubenswrapper[4775]: I0127 11:26:02.967342 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.140022 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7744f4db6d-kv9sd"] Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.602022 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"] Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.602988 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605548 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605804 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.605855 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.606267 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.607072 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.607114 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.616894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"] Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.752743 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2bf0be-df8b-4f40-a468-4d32ed97bbeb" path="/var/lib/kubelet/pods/9d2bf0be-df8b-4f40-a468-4d32ed97bbeb/volumes" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.754136 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1b6882d-984d-432b-b3df-101a6437371b" path="/var/lib/kubelet/pods/e1b6882d-984d-432b-b3df-101a6437371b/volumes" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772563 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.772753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tms6z\" (UniqueName: 
\"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" event={"ID":"7a1706f3-485f-4023-aee7-43602de1dafe","Type":"ContainerStarted","Data":"66694f9a395796a14832316007f29fd012a06140e29b909ee8f4aafeaf542760"} Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847596 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" event={"ID":"7a1706f3-485f-4023-aee7-43602de1dafe","Type":"ContainerStarted","Data":"fb130040fccc2dedda3fb909f2d67ff304e9ea9b06859e087a77f70e1d20fe9a"} Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.847864 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.857388 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.865019 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7744f4db6d-kv9sd" podStartSLOduration=1.864996868 podStartE2EDuration="1.864996868s" podCreationTimestamp="2026-01-27 11:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:03.864225426 +0000 UTC m=+343.005823233" watchObservedRunningTime="2026-01-27 11:26:03.864996868 +0000 UTC m=+343.006594645" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.873993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.874056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tms6z\" (UniqueName: \"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.874992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.875217 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod 
\"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.875253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-client-ca\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.877232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ee55e6e-e4e4-4af7-9585-f033b6db6467-config\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.883309 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ee55e6e-e4e4-4af7-9585-f033b6db6467-serving-cert\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.891894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tms6z\" (UniqueName: \"kubernetes.io/projected/2ee55e6e-e4e4-4af7-9585-f033b6db6467-kube-api-access-tms6z\") pod \"route-controller-manager-6f86f9bf4f-6569c\" (UID: \"2ee55e6e-e4e4-4af7-9585-f033b6db6467\") " pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:03 crc kubenswrapper[4775]: I0127 11:26:03.953126 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.336196 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c"] Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.858667 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" event={"ID":"2ee55e6e-e4e4-4af7-9585-f033b6db6467","Type":"ContainerStarted","Data":"535565bc215fc2416402a17e27a9411f7a70f23aa78bb5a5f6f062cc7581e0e3"} Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.859085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" event={"ID":"2ee55e6e-e4e4-4af7-9585-f033b6db6467","Type":"ContainerStarted","Data":"e2332df7f9c512bfc49faf3e7a13558997430836ccac094d607d01ccdd0283cb"} Jan 27 11:26:04 crc kubenswrapper[4775]: I0127 11:26:04.881202 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" podStartSLOduration=3.8811743 podStartE2EDuration="3.8811743s" podCreationTimestamp="2026-01-27 11:26:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:04.880263246 +0000 UTC m=+344.021861023" watchObservedRunningTime="2026-01-27 11:26:04.8811743 +0000 UTC m=+344.022772097" Jan 27 11:26:05 crc kubenswrapper[4775]: I0127 11:26:05.863957 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:05 crc kubenswrapper[4775]: I0127 11:26:05.869779 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f86f9bf4f-6569c" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.910975 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.913359 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.916673 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.920956 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955288 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:20 crc kubenswrapper[4775]: I0127 11:26:20.955352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.055993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.056300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.056536 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.057437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-utilities\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.057739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-catalog-content\") pod \"community-operators-klf7d\" (UID: 
\"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.080147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vqpb\" (UniqueName: \"kubernetes.io/projected/30eb115d-82ef-4c37-8cf4-4f2945ad86c1-kube-api-access-5vqpb\") pod \"community-operators-klf7d\" (UID: \"30eb115d-82ef-4c37-8cf4-4f2945ad86c1\") " pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.235088 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.503307 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.511077 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.519827 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.521819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694767 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.694914 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.706483 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-klf7d"] Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.795925 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.796488 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") 
" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.796542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-utilities\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.797152 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9db1a996-ad2f-460c-9d8d-cacc63c4924d-catalog-content\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.797342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.818070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlx46\" (UniqueName: \"kubernetes.io/projected/9db1a996-ad2f-460c-9d8d-cacc63c4924d-kube-api-access-qlx46\") pod \"redhat-marketplace-xbvgj\" (UID: \"9db1a996-ad2f-460c-9d8d-cacc63c4924d\") " pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.834735 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.843984 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950704 4775 generic.go:334] "Generic (PLEG): container finished" podID="30eb115d-82ef-4c37-8cf4-4f2945ad86c1" containerID="d421b49a5cb7396e4023242da08bce2682bc2eff68fa7ea941ad8a12eaa85899" exitCode=0 Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950811 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerDied","Data":"d421b49a5cb7396e4023242da08bce2682bc2eff68fa7ea941ad8a12eaa85899"} Jan 27 11:26:21 crc kubenswrapper[4775]: I0127 11:26:21.950927 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerStarted","Data":"833c5e147fd7fffc42baff4a223b6090d5f0d820cd757247d85e38a51b1ba790"} Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.242695 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xbvgj"] Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.961953 4775 generic.go:334] "Generic (PLEG): container finished" podID="9db1a996-ad2f-460c-9d8d-cacc63c4924d" containerID="178a32703074e3fcf3b7fb9e371a5571616a222628b1e8b4d4d82ba09bb27c0b" exitCode=0 Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.961998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerDied","Data":"178a32703074e3fcf3b7fb9e371a5571616a222628b1e8b4d4d82ba09bb27c0b"} Jan 27 11:26:22 crc kubenswrapper[4775]: I0127 11:26:22.962326 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerStarted","Data":"8873508af937bdb58c2e53a1ae67ca35b860075c1bcfbb39a873af7354116971"} Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.300760 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.301689 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.304501 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.323034 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.418670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.418740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.419084 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520244 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.520826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-utilities\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.521091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6ef80c4-f4f3-4ba1-b98e-63738725009d-catalog-content\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " 
pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.545664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzrnt\" (UniqueName: \"kubernetes.io/projected/c6ef80c4-f4f3-4ba1-b98e-63738725009d-kube-api-access-vzrnt\") pod \"redhat-operators-87qp8\" (UID: \"c6ef80c4-f4f3-4ba1-b98e-63738725009d\") " pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.618329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.910672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.912242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.914582 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.918359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.968739 4775 generic.go:334] "Generic (PLEG): container finished" podID="9db1a996-ad2f-460c-9d8d-cacc63c4924d" containerID="7cc05d44e18420d0e038bb2da39fac51d372552f66c7912faf9f6eafdeb37172" exitCode=0 Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.968810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerDied","Data":"7cc05d44e18420d0e038bb2da39fac51d372552f66c7912faf9f6eafdeb37172"} Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.970495 4775 generic.go:334] "Generic (PLEG): container finished" podID="30eb115d-82ef-4c37-8cf4-4f2945ad86c1" containerID="76f3b2ac0d75f15ba59659ec2b0353e1c9e72bb8399df264a2d66cc5e85ed7f0" exitCode=0 Jan 27 11:26:23 crc kubenswrapper[4775]: I0127 11:26:23.970532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerDied","Data":"76f3b2ac0d75f15ba59659ec2b0353e1c9e72bb8399df264a2d66cc5e85ed7f0"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030041 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030109 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.030369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.034557 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-87qp8"] Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131472 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131541 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.131995 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-utilities\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.132213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d8922-b4e4-4162-acbe-4294c4746204-catalog-content\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.151387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np2n4\" (UniqueName: \"kubernetes.io/projected/b55d8922-b4e4-4162-acbe-4294c4746204-kube-api-access-np2n4\") pod \"certified-operators-5mgmj\" (UID: \"b55d8922-b4e4-4162-acbe-4294c4746204\") " pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.240282 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.627366 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5mgmj"] Jan 27 11:26:24 crc kubenswrapper[4775]: W0127 11:26:24.632241 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb55d8922_b4e4_4162_acbe_4294c4746204.slice/crio-a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4 WatchSource:0}: Error finding container a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4: Status 404 returned error can't find the container with id a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.976420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-klf7d" event={"ID":"30eb115d-82ef-4c37-8cf4-4f2945ad86c1","Type":"ContainerStarted","Data":"d447e35293d37a3ba0d59c56f818a959d8bd43118847daa838d2007a0d225ec1"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978392 4775 generic.go:334] "Generic (PLEG): container finished" podID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerID="84c0689c056eae572aca7363d8bfb6f22824af6abde73b867420fd49d09493a1" exitCode=0 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerDied","Data":"84c0689c056eae572aca7363d8bfb6f22824af6abde73b867420fd49d09493a1"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.978504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"89bcc0e9f79ab7eb0e16ab77a5809ef59247e944cba96054332e58ad5cb2f568"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.981378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xbvgj" event={"ID":"9db1a996-ad2f-460c-9d8d-cacc63c4924d","Type":"ContainerStarted","Data":"8a84ae7c295fccb6d2e8d3f57355dfb9ce7579c64e403d106e552945c44b76ec"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983080 4775 generic.go:334] "Generic (PLEG): container finished" podID="b55d8922-b4e4-4162-acbe-4294c4746204" containerID="1c3a3775845760503a5ad415560044cd02cceff592c3c5f46e29e42cc0b78917" exitCode=0 Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerDied","Data":"1c3a3775845760503a5ad415560044cd02cceff592c3c5f46e29e42cc0b78917"} Jan 27 11:26:24 crc kubenswrapper[4775]: I0127 11:26:24.983126 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerStarted","Data":"a96976e9917baef146251a37ceee6fa25e88f6995d20b06a3cd4c60449af18d4"} Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.000031 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-klf7d" podStartSLOduration=2.528553743 podStartE2EDuration="5.000015723s" podCreationTimestamp="2026-01-27 11:26:20 +0000 UTC" 
firstStartedPulling="2026-01-27 11:26:21.953569779 +0000 UTC m=+361.095167546" lastFinishedPulling="2026-01-27 11:26:24.425031739 +0000 UTC m=+363.566629526" observedRunningTime="2026-01-27 11:26:24.997773432 +0000 UTC m=+364.139371229" watchObservedRunningTime="2026-01-27 11:26:25.000015723 +0000 UTC m=+364.141613510" Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.037882 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xbvgj" podStartSLOduration=2.6093850549999997 podStartE2EDuration="4.03785964s" podCreationTimestamp="2026-01-27 11:26:21 +0000 UTC" firstStartedPulling="2026-01-27 11:26:22.963595043 +0000 UTC m=+362.105192830" lastFinishedPulling="2026-01-27 11:26:24.392069638 +0000 UTC m=+363.533667415" observedRunningTime="2026-01-27 11:26:25.036562294 +0000 UTC m=+364.178160071" watchObservedRunningTime="2026-01-27 11:26:25.03785964 +0000 UTC m=+364.179457417" Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.990487 4775 generic.go:334] "Generic (PLEG): container finished" podID="b55d8922-b4e4-4162-acbe-4294c4746204" containerID="6b59a05568990621ed5774ab56c43518dc693b1ac996548749222d0c3b8c40c0" exitCode=0 Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.990580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerDied","Data":"6b59a05568990621ed5774ab56c43518dc693b1ac996548749222d0c3b8c40c0"} Jan 27 11:26:25 crc kubenswrapper[4775]: I0127 11:26:25.994529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9"} Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.001585 4775 generic.go:334] "Generic (PLEG): container finished" podID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerID="a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9" exitCode=0 Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.001715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerDied","Data":"a840ef808ea66d84b114c859330ee32b809535f32fe824249a97d8b9e00d2bd9"} Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.004873 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5mgmj" event={"ID":"b55d8922-b4e4-4162-acbe-4294c4746204","Type":"ContainerStarted","Data":"c9e399f257b5d94e8b7181d8eadc767becdc80cda89752762b5cb2993685e8cd"} Jan 27 11:26:27 crc kubenswrapper[4775]: I0127 11:26:27.047184 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5mgmj" podStartSLOduration=2.579689559 podStartE2EDuration="4.047156581s" podCreationTimestamp="2026-01-27 11:26:23 +0000 UTC" firstStartedPulling="2026-01-27 11:26:24.984053063 +0000 UTC m=+364.125650840" lastFinishedPulling="2026-01-27 11:26:26.451520085 +0000 UTC m=+365.593117862" observedRunningTime="2026-01-27 11:26:27.041413551 +0000 UTC m=+366.183011328" watchObservedRunningTime="2026-01-27 11:26:27.047156581 +0000 UTC m=+366.188754358" Jan 27 11:26:28 crc kubenswrapper[4775]: I0127 11:26:28.010981 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-87qp8" 
event={"ID":"c6ef80c4-f4f3-4ba1-b98e-63738725009d","Type":"ContainerStarted","Data":"2223891d8828dce1d4d66852bc57cdea0e57d36d0f846608cc0b5785742e81b7"} Jan 27 11:26:28 crc kubenswrapper[4775]: I0127 11:26:28.036962 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-87qp8" podStartSLOduration=2.562585335 podStartE2EDuration="5.036919934s" podCreationTimestamp="2026-01-27 11:26:23 +0000 UTC" firstStartedPulling="2026-01-27 11:26:24.979728483 +0000 UTC m=+364.121326260" lastFinishedPulling="2026-01-27 11:26:27.454063082 +0000 UTC m=+366.595660859" observedRunningTime="2026-01-27 11:26:28.03133245 +0000 UTC m=+367.172930257" watchObservedRunningTime="2026-01-27 11:26:28.036919934 +0000 UTC m=+367.178517731" Jan 27 11:26:29 crc kubenswrapper[4775]: I0127 11:26:29.518235 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:26:29 crc kubenswrapper[4775]: I0127 11:26:29.518621 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.236044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.236490 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.283464 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.845840 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.845901 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:31 crc kubenswrapper[4775]: I0127 11:26:31.888254 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:32 crc kubenswrapper[4775]: I0127 11:26:32.071631 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xbvgj" Jan 27 11:26:32 crc kubenswrapper[4775]: I0127 11:26:32.082878 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-klf7d" Jan 27 11:26:33 crc kubenswrapper[4775]: I0127 11:26:33.619658 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:33 crc kubenswrapper[4775]: I0127 11:26:33.620060 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.240815 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.241372 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.285136 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:34 crc kubenswrapper[4775]: I0127 11:26:34.687749 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-87qp8" podUID="c6ef80c4-f4f3-4ba1-b98e-63738725009d" containerName="registry-server" probeResult="failure" output=< Jan 27 11:26:34 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:26:34 crc kubenswrapper[4775]: > Jan 27 11:26:35 crc kubenswrapper[4775]: I0127 11:26:35.095274 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5mgmj" Jan 27 11:26:41 crc kubenswrapper[4775]: I0127 11:26:41.941386 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"] Jan 27 11:26:41 crc kubenswrapper[4775]: I0127 11:26:41.944257 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.025901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"] Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065285 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065355 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c44wh\" 
(UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.065647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.105524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c44wh\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167378 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167474 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod 
\"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167575 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.167709 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-ca-trust-extracted\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168331 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-trusted-ca\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.168649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-certificates\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.173078 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-installation-pull-secrets\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.179526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-registry-tls\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.182164 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-c44wh\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-kube-api-access-c44wh\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.182912 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ee2a4899-c4ef-40a2-aeac-2596fdc1b282-bound-sa-token\") pod \"image-registry-66df7c8f76-jh4wp\" (UID: \"ee2a4899-c4ef-40a2-aeac-2596fdc1b282\") " pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.260625 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:42 crc kubenswrapper[4775]: I0127 11:26:42.700177 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-jh4wp"] Jan 27 11:26:42 crc kubenswrapper[4775]: W0127 11:26:42.706766 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2a4899_c4ef_40a2_aeac_2596fdc1b282.slice/crio-8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883 WatchSource:0}: Error finding container 8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883: Status 404 returned error can't find the container with id 8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883 Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" event={"ID":"ee2a4899-c4ef-40a2-aeac-2596fdc1b282","Type":"ContainerStarted","Data":"09eae0822eb6c795fdc683be48e277bc5e9c6501fad209398cc833bc2e5da80a"} Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" event={"ID":"ee2a4899-c4ef-40a2-aeac-2596fdc1b282","Type":"ContainerStarted","Data":"8ccbbfd68ad0ce753ff006033bddbb216e49e1b3084397cd518e4240e8d1f883"} Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.090628 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.110253 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" podStartSLOduration=2.110234508 podStartE2EDuration="2.110234508s" podCreationTimestamp="2026-01-27 11:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:26:43.107634556 +0000 UTC m=+382.249232333" watchObservedRunningTime="2026-01-27 11:26:43.110234508 +0000 UTC m=+382.251832285" Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.673557 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:43 crc kubenswrapper[4775]: I0127 11:26:43.716070 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-87qp8" Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518091 4775 
patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518905 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.518978 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.520179 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:26:59 crc kubenswrapper[4775]: I0127 11:26:59.520377 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" gracePeriod=600 Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195362 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" exitCode=0 Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195503 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8"} Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"} Jan 27 11:27:00 crc kubenswrapper[4775]: I0127 11:27:00.195667 4775 scope.go:117] "RemoveContainer" containerID="e40b981b74dbb5179c9a9c4b3ca6ec3675bd945bcf5d0c078c41b18e61051ce4" Jan 27 11:27:02 crc kubenswrapper[4775]: I0127 11:27:02.273883 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-jh4wp" Jan 27 11:27:02 crc kubenswrapper[4775]: I0127 11:27:02.354042 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.422685 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" 
containerID="cri-o://666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" gracePeriod=30 Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.864392 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.958956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959330 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959406 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959514 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959632 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.959655 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.960156 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") pod \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\" (UID: \"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5\") " Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.960595 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: 
"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.961004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.964940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.965274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps" (OuterVolumeSpecName: "kube-api-access-629ps") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "kube-api-access-629ps". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.966044 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.978038 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.983381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:27:27 crc kubenswrapper[4775]: I0127 11:27:27.992213 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" (UID: "cbb40aba-c103-4a72-abd7-3e5b3aaa82e5"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061205 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061238 4775 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061249 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-629ps\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-kube-api-access-629ps\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061258 4775 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061267 4775 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061275 4775 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.061286 4775 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381440 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" exitCode=0 Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381538 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerDied","Data":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"} Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" event={"ID":"cbb40aba-c103-4a72-abd7-3e5b3aaa82e5","Type":"ContainerDied","Data":"8d37a2d435548adc351dbcf45235ea8b83864719085f8dffa0da9c361fa7f477"} Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381610 4775 scope.go:117] "RemoveContainer" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.381769 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-b7lls" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.408099 4775 scope.go:117] "RemoveContainer" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" Jan 27 11:27:28 crc kubenswrapper[4775]: E0127 11:27:28.408795 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": container with ID starting with 666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994 not found: ID does not exist" containerID="666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.408974 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994"} err="failed to get container status \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": rpc error: code = NotFound desc = could not find container \"666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994\": container with ID starting with 666add924bf28184dc90d22450a4bda575f1f99fdd5b7512c0bb3c31baeb1994 not found: ID does not exist" Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.440290 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:27:28 crc kubenswrapper[4775]: I0127 11:27:28.448282 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-b7lls"] Jan 27 11:27:29 crc kubenswrapper[4775]: I0127 11:27:29.754639 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" path="/var/lib/kubelet/pods/cbb40aba-c103-4a72-abd7-3e5b3aaa82e5/volumes" Jan 27 11:28:59 crc kubenswrapper[4775]: I0127 11:28:59.517982 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:28:59 crc kubenswrapper[4775]: I0127 11:28:59.518674 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:29:29 crc kubenswrapper[4775]: I0127 11:29:29.518205 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:29:29 crc kubenswrapper[4775]: I0127 11:29:29.519023 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.517589 4775 patch_prober.go:28] interesting 
pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.518297 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.518354 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.519349 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:29:59 crc kubenswrapper[4775]: I0127 11:29:59.519488 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" gracePeriod=600 Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.214729 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 11:30:00 crc kubenswrapper[4775]: E0127 11:30:00.214986 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215001 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215158 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb40aba-c103-4a72-abd7-3e5b3aaa82e5" containerName="registry" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.215677 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.218592 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.219502 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.240523 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.332995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.333040 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.333073 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353482 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" exitCode=0 Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d"} Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.353570 4775 scope.go:117] "RemoveContainer" containerID="b93020ef7c9430606536756315c4ef1de229e2e6eaf460073cd42ad0825e59e8" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434642 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: 
\"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.434723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.437944 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.441098 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.454557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"collect-profiles-29491890-4glmj\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.549608 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:00 crc kubenswrapper[4775]: I0127 11:30:00.782907 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"] Jan 27 11:30:00 crc kubenswrapper[4775]: W0127 11:30:00.788893 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb6e2d5_5884_4a3b_84a1_88a5ee052da9.slice/crio-5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a WatchSource:0}: Error finding container 5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a: Status 404 returned error can't find the container with id 5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.363141 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365574 4775 generic.go:334] "Generic (PLEG): container finished" podID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerID="0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1" exitCode=0 Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365632 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerDied","Data":"0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1"} Jan 27 11:30:01 crc kubenswrapper[4775]: I0127 11:30:01.365810 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerStarted","Data":"5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a"} Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.658772 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.758946 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") pod \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\" (UID: \"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9\") " Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.759435 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.765131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6" (OuterVolumeSpecName: "kube-api-access-zf2x6") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "kube-api-access-zf2x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.765571 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" (UID: "2fb6e2d5-5884-4a3b-84a1-88a5ee052da9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861375 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861442 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf2x6\" (UniqueName: \"kubernetes.io/projected/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-kube-api-access-zf2x6\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:02 crc kubenswrapper[4775]: I0127 11:30:02.861487 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379638 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" event={"ID":"2fb6e2d5-5884-4a3b-84a1-88a5ee052da9","Type":"ContainerDied","Data":"5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a"} Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379698 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d19b0ec82f58acd6b7292740a646c97c68e599ea09a849b9ef9026933405c8a" Jan 27 11:30:03 crc kubenswrapper[4775]: I0127 11:30:03.379751 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.512533 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:24 crc kubenswrapper[4775]: E0127 11:30:24.513214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513231 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513356 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" containerName="collect-profiles" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.513828 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516160 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516240 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-xnp7l" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.516331 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.519331 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.520075 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.522667 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-99fqq" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.539672 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.543416 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.555137 4775 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9gbvg" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.561707 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.573861 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.578221 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mps57\" (UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.625733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod \"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.726817 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.726887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mps57\" (UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 
11:30:24.726954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod \"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.746904 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf8cs\" (UniqueName: \"kubernetes.io/projected/ea378b66-945f-4832-b293-59576474b63c-kube-api-access-pf8cs\") pod \"cert-manager-cainjector-cf98fcc89-4sq7k\" (UID: \"ea378b66-945f-4832-b293-59576474b63c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.747219 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mps57\" (UniqueName: \"kubernetes.io/projected/6b64e5cd-1b80-489b-8d69-3ebf7862eb9f-kube-api-access-mps57\") pod \"cert-manager-858654f9db-xpr9c\" (UID: \"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f\") " pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.749003 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6lp\" (UniqueName: \"kubernetes.io/projected/882dbf86-77c4-46a5-a75b-b7b4a70d3ac1-kube-api-access-ld6lp\") pod \"cert-manager-webhook-687f57d79b-5w45m\" (UID: \"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1\") " pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.831338 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-xpr9c" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.856714 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:24 crc kubenswrapper[4775]: I0127 11:30:24.871707 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.110074 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-5w45m"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.127635 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882dbf86_77c4_46a5_a75b_b7b4a70d3ac1.slice/crio-8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1 WatchSource:0}: Error finding container 8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1: Status 404 returned error can't find the container with id 8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1 Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.134619 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.225711 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-xpr9c"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.228390 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b64e5cd_1b80_489b_8d69_3ebf7862eb9f.slice/crio-cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1 WatchSource:0}: Error finding container cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1: Status 404 returned error can't find the container with id cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1 Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.345985 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k"] Jan 27 11:30:25 crc kubenswrapper[4775]: W0127 11:30:25.352348 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea378b66_945f_4832_b293_59576474b63c.slice/crio-8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f WatchSource:0}: Error finding container 8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f: Status 404 returned error can't find the container with id 8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.506794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" event={"ID":"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1","Type":"ContainerStarted","Data":"8e516a48a5283c59824a01da1767452001b6c3437c650aad1db8ee974a5dcec1"} Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.507519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" event={"ID":"ea378b66-945f-4832-b293-59576474b63c","Type":"ContainerStarted","Data":"8ba3c121ed43ec9490ca405d7f2c29b4658843363deda6eeac4101b849813c0f"} Jan 27 11:30:25 crc kubenswrapper[4775]: I0127 11:30:25.508869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xpr9c" event={"ID":"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f","Type":"ContainerStarted","Data":"cc5bfce37ba03eebb6f0fb25d88cd1f29f89ecc9ffd93d0c24f9572eb7c125d1"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.528995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-xpr9c" 
event={"ID":"6b64e5cd-1b80-489b-8d69-3ebf7862eb9f","Type":"ContainerStarted","Data":"c225bad7fda744ee0a152305e2c2dde6fc4462a36e0f2f46dda60d4df60d3c37"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.532023 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" event={"ID":"882dbf86-77c4-46a5-a75b-b7b4a70d3ac1","Type":"ContainerStarted","Data":"1db14bf2755bae4b1b89b0075cbbaffd66899b5d89d451e3607c287c4f08e3ee"} Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.532150 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.544818 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-xpr9c" podStartSLOduration=1.499885586 podStartE2EDuration="4.544793059s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.231171593 +0000 UTC m=+604.372769370" lastFinishedPulling="2026-01-27 11:30:28.276079026 +0000 UTC m=+607.417676843" observedRunningTime="2026-01-27 11:30:28.541263577 +0000 UTC m=+607.682861354" watchObservedRunningTime="2026-01-27 11:30:28.544793059 +0000 UTC m=+607.686390846" Jan 27 11:30:28 crc kubenswrapper[4775]: I0127 11:30:28.572342 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" podStartSLOduration=1.422662092 podStartE2EDuration="4.572322391s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.134394056 +0000 UTC m=+604.275991823" lastFinishedPulling="2026-01-27 11:30:28.284054305 +0000 UTC m=+607.425652122" observedRunningTime="2026-01-27 11:30:28.566050996 +0000 UTC m=+607.707648763" watchObservedRunningTime="2026-01-27 11:30:28.572322391 +0000 UTC m=+607.713920178" Jan 27 11:30:29 crc kubenswrapper[4775]: I0127 11:30:29.540663 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" event={"ID":"ea378b66-945f-4832-b293-59576474b63c","Type":"ContainerStarted","Data":"c833c29eb74e627ad9fea30411217a37db682ae9fd5c2c2a4a2c0094511ed59b"} Jan 27 11:30:29 crc kubenswrapper[4775]: I0127 11:30:29.566495 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-4sq7k" podStartSLOduration=1.855251481 podStartE2EDuration="5.566438939s" podCreationTimestamp="2026-01-27 11:30:24 +0000 UTC" firstStartedPulling="2026-01-27 11:30:25.353713755 +0000 UTC m=+604.495311572" lastFinishedPulling="2026-01-27 11:30:29.064901253 +0000 UTC m=+608.206499030" observedRunningTime="2026-01-27 11:30:29.56042176 +0000 UTC m=+608.702019557" watchObservedRunningTime="2026-01-27 11:30:29.566438939 +0000 UTC m=+608.708036746" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.272415 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.273937 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" containerID="cri-o://109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274138 4775 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" containerID="cri-o://22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274228 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" containerID="cri-o://f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274566 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274609 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" containerID="cri-o://46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274631 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" containerID="cri-o://491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.274649 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" containerID="cri-o://2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.328847 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" containerID="cri-o://fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" gracePeriod=30 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.574052 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovnkube-controller/3.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.574743 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.577513 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578330 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578903 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" 
containerID="fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578934 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" exitCode=143 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578944 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578954 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578963 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578971 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578970 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.578980 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" exitCode=0 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579102 4775 generic.go:334] "Generic (PLEG): container finished" podID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerID="109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" exitCode=143 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579042 4775 scope.go:117] "RemoveContainer" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579356 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" 
event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579383 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579407 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579419 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" event={"ID":"7d657d41-09b6-43f2-babb-4cb13a62fd1f","Type":"ContainerDied","Data":"f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.579431 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f888bf350c80a3614a432edcc4a4b855273dcb2c8f4a4adedcb465a13b969229" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581324 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581866 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581909 4775 generic.go:334] "Generic (PLEG): container finished" podID="aba2edc6-0e64-4995-830d-e177919ea13e" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" exitCode=2 Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.581934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerDied","Data":"bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09"} Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.582621 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.582998 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.611121 4775 scope.go:117] "RemoveContainer" containerID="da7f6adb715a2fa0ac12d90a6730a260ca418e7747933510e8b739d3819d0044" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.611999 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c\": container with ID starting with aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c not found: ID does not exist" containerID="aff7c08d9328213ec8a7f6fadde872ccd07c469326526f871185509ccf13571c" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.612853 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.617590 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.618255 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.631205 4775 scope.go:117] "RemoveContainer" containerID="750a2bbab27182907359a500a80a4d0be1d667b9a8eb1904246cf378c193f298" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673037 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mzqrg"] Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673327 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673354 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673374 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673386 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673398 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673412 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673440 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673524 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673659 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673680 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc 
kubenswrapper[4775]: I0127 11:30:34.673693 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673711 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673722 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673741 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673752 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673771 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kubecfg-setup" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673783 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kubecfg-setup" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673797 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673807 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673825 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673835 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.673851 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.673864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674025 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="nbdb" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674042 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674059 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674070 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674087 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="sbdb" Jan 27 11:30:34 crc 
kubenswrapper[4775]: I0127 11:30:34.674103 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674116 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674128 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674142 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674153 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="kube-rbac-proxy-node" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674171 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674184 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="northd" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.674356 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674370 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: E0127 11:30:34.674383 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674394 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovn-acl-logging" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.674582 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" containerName="ovnkube-controller" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.676857 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755681 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755738 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755768 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755790 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755846 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755921 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755951 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.755998 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756020 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756064 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756102 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756134 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") pod 
\"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756198 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756221 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") pod \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\" (UID: \"7d657d41-09b6-43f2-babb-4cb13a62fd1f\") " Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash" (OuterVolumeSpecName: "host-slash") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756413 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756442 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756479 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756524 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756527 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756569 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756584 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756615 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756632 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket" (OuterVolumeSpecName: "log-socket") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756633 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756741 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log" (OuterVolumeSpecName: "node-log") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756787 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). 
InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756930 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.756948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757127 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757239 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757268 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757307 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757525 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757670 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757783 4775 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757796 4775 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757807 4775 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757818 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757828 4775 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-slash\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757837 4775 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757847 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757855 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757863 4775 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757871 4775 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757879 4775 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757887 4775 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.757897 4775 
reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-log-socket\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758150 4775 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758159 4775 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758167 4775 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.758174 4775 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-node-log\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.761183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4" (OuterVolumeSpecName: "kube-api-access-czdm4") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "kube-api-access-czdm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.761260 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.768011 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7d657d41-09b6-43f2-babb-4cb13a62fd1f" (UID: "7d657d41-09b6-43f2-babb-4cb13a62fd1f"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859782 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859934 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-5w45m" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-ovn\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.859947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-netns\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-systemd-units\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860120 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-etc-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: 
\"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860125 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860165 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860193 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860221 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-log-socket\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-run-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860269 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-node-log\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc 
kubenswrapper[4775]: I0127 11:30:34.860250 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-netd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860439 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860574 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 
11:30:34.860797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-env-overrides\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860881 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7d657d41-09b6-43f2-babb-4cb13a62fd1f-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860884 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-kubelet\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860894 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czdm4\" (UniqueName: \"kubernetes.io/projected/7d657d41-09b6-43f2-babb-4cb13a62fd1f-kube-api-access-czdm4\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.860907 4775 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7d657d41-09b6-43f2-babb-4cb13a62fd1f-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-script-lib\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861261 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-cni-bin\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861325 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-slash\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861509 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-var-lib-openvswitch\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fae72616-e516-4ce6-86b8-b28f14a92939-ovnkube-config\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.861668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fae72616-e516-4ce6-86b8-b28f14a92939-run-systemd\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.863974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fae72616-e516-4ce6-86b8-b28f14a92939-ovn-node-metrics-cert\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.879021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gpt5\" (UniqueName: \"kubernetes.io/projected/fae72616-e516-4ce6-86b8-b28f14a92939-kube-api-access-6gpt5\") pod \"ovnkube-node-mzqrg\" (UID: \"fae72616-e516-4ce6-86b8-b28f14a92939\") " pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:34 crc kubenswrapper[4775]: I0127 11:30:34.988798 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589537 4775 generic.go:334] "Generic (PLEG): container finished" podID="fae72616-e516-4ce6-86b8-b28f14a92939" containerID="d16ee06f4b6448af85e17a5a56c4db31922ee5f3324e04e6e548d036b2cbe3a3" exitCode=0 Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerDied","Data":"d16ee06f4b6448af85e17a5a56c4db31922ee5f3324e04e6e548d036b2cbe3a3"} Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.589662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"09cfe9078423751d584eee942954fb930c2d766b5a623b8f7aba105e569f907d"} Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.593701 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-acl-logging/1.log" Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.597208 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nzthg_7d657d41-09b6-43f2-babb-4cb13a62fd1f/ovn-controller/0.log" Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.597764 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nzthg" Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.599247 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log" Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.718718 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.724639 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nzthg"] Jan 27 11:30:35 crc kubenswrapper[4775]: I0127 11:30:35.752467 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d657d41-09b6-43f2-babb-4cb13a62fd1f" path="/var/lib/kubelet/pods/7d657d41-09b6-43f2-babb-4cb13a62fd1f/volumes" Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610582 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"89f663a6b0297d93e84b18f60117d31df8f6eb622689e23db150f2704fe5cf7a"} Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"c461e08ca691633285c1043270a5de4e7e99dcb43dc1b92b3f2a4e5473ec2105"} Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610923 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"189d778b681818ad736a67d73c2ed45c58d46018ab1188ddd77713ee2c8206b0"} Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610932 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" 
event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"e21d1c3529b62f8826c3112a7e93fc08adf3dfaa1a5f37c6371a0eaf0b4f5a62"} Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610941 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"ba6b34b749ae1647219001268d83b10d62ccde5e3f7e6b5f6ffddd83de26566a"} Jan 27 11:30:36 crc kubenswrapper[4775]: I0127 11:30:36.610950 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"5be8408bc0028ddffd2077b92d63b4c221cfe55e52c960653307ec6b72b179bf"} Jan 27 11:30:39 crc kubenswrapper[4775]: I0127 11:30:39.639641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"65098304a8a1a37d604c34217d9c290cfd3f2d1861a71136999dba5eb846c23b"} Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.658332 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" event={"ID":"fae72616-e516-4ce6-86b8-b28f14a92939","Type":"ContainerStarted","Data":"af117489bcb760cd6b4bade688f366a50c03ae76050e2dd6d0f1b2c7b9c2a49a"} Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659573 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659671 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.659762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.695950 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podStartSLOduration=7.695932513 podStartE2EDuration="7.695932513s" podCreationTimestamp="2026-01-27 11:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:30:41.691038584 +0000 UTC m=+620.832636451" watchObservedRunningTime="2026-01-27 11:30:41.695932513 +0000 UTC m=+620.837530300" Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.706771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:41 crc kubenswrapper[4775]: I0127 11:30:41.712740 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:30:48 crc kubenswrapper[4775]: I0127 11:30:48.745067 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" Jan 27 11:30:48 crc kubenswrapper[4775]: E0127 11:30:48.745954 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-gm7w4_openshift-multus(aba2edc6-0e64-4995-830d-e177919ea13e)\"" pod="openshift-multus/multus-gm7w4" podUID="aba2edc6-0e64-4995-830d-e177919ea13e" Jan 27 11:30:59 crc kubenswrapper[4775]: 
I0127 11:30:59.745737 4775 scope.go:117] "RemoveContainer" containerID="bcc243e4b73c14109c2dd74058668508df08b94a8ab3ccb4e2fac0e77e263f09" Jan 27 11:31:00 crc kubenswrapper[4775]: I0127 11:31:00.807779 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gm7w4_aba2edc6-0e64-4995-830d-e177919ea13e/kube-multus/2.log" Jan 27 11:31:00 crc kubenswrapper[4775]: I0127 11:31:00.808445 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gm7w4" event={"ID":"aba2edc6-0e64-4995-830d-e177919ea13e","Type":"ContainerStarted","Data":"35d4395d9d7eb335c205003f382ddb1acd0f675feb8bdae5008fcf9452419a97"} Jan 27 11:31:05 crc kubenswrapper[4775]: I0127 11:31:05.018859 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.423200 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.424675 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.427920 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.435490 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487798 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.487877 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589610 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589664 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.589706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.590490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.590505 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.614933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.742262 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:14 crc kubenswrapper[4775]: I0127 11:31:14.964543 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8"] Jan 27 11:31:14 crc kubenswrapper[4775]: W0127 11:31:14.971913 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod252d02e0_ca7d_405f_8315_3588f55a7b0c.slice/crio-1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106 WatchSource:0}: Error finding container 1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106: Status 404 returned error can't find the container with id 1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106 Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917473 4775 generic.go:334] "Generic (PLEG): container finished" podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="50003896a68c4b615e5518043da0851529315fd0b9f601839474320686b53035" exitCode=0 Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917735 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"50003896a68c4b615e5518043da0851529315fd0b9f601839474320686b53035"} Jan 27 11:31:15 crc kubenswrapper[4775]: I0127 11:31:15.917762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerStarted","Data":"1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106"} Jan 27 11:31:17 crc kubenswrapper[4775]: I0127 11:31:17.929937 4775 generic.go:334] "Generic (PLEG): container finished" podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="0e989fb340becd147e428c9537634b8d3fbdf14a94f26bb24eb8c7e194acbb5c" exitCode=0 Jan 27 11:31:17 crc kubenswrapper[4775]: I0127 11:31:17.930005 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"0e989fb340becd147e428c9537634b8d3fbdf14a94f26bb24eb8c7e194acbb5c"} Jan 27 11:31:18 crc kubenswrapper[4775]: I0127 11:31:18.946117 4775 generic.go:334] "Generic (PLEG): container finished" podID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerID="886e6e6985b6672998c5a3f60f6e73d0df864b637b88600dbb8d779ea1634165" exitCode=0 Jan 27 11:31:18 crc kubenswrapper[4775]: I0127 11:31:18.946178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"886e6e6985b6672998c5a3f60f6e73d0df864b637b88600dbb8d779ea1634165"} Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.194275 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265010 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265182 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") pod \"252d02e0-ca7d-405f-8315-3588f55a7b0c\" (UID: \"252d02e0-ca7d-405f-8315-3588f55a7b0c\") " Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.265769 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle" (OuterVolumeSpecName: "bundle") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.271748 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq" (OuterVolumeSpecName: "kube-api-access-68vfq") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "kube-api-access-68vfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.280923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util" (OuterVolumeSpecName: "util") pod "252d02e0-ca7d-405f-8315-3588f55a7b0c" (UID: "252d02e0-ca7d-405f-8315-3588f55a7b0c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367054 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367096 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/252d02e0-ca7d-405f-8315-3588f55a7b0c-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.367106 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68vfq\" (UniqueName: \"kubernetes.io/projected/252d02e0-ca7d-405f-8315-3588f55a7b0c-kube-api-access-68vfq\") on node \"crc\" DevicePath \"\"" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" event={"ID":"252d02e0-ca7d-405f-8315-3588f55a7b0c","Type":"ContainerDied","Data":"1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106"} Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960498 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1627ca7d80f4fbae983368c9ca389767a714a783645d92a33c725d70b38f8106" Jan 27 11:31:20 crc kubenswrapper[4775]: I0127 11:31:20.960418 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.962925 4775 scope.go:117] "RemoveContainer" containerID="2f00ecf83985e5fb1f9fb6a6a5aca74d82fdf7c671bb6c064c9161ecae5e9c04" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.983311 4775 scope.go:117] "RemoveContainer" containerID="377794d7313a6c239a88c3f7885bbdcb7e5d521b0c3402cc7c0ba03a0eb30333" Jan 27 11:31:21 crc kubenswrapper[4775]: I0127 11:31:21.997099 4775 scope.go:117] "RemoveContainer" containerID="22b47f704f654f6f6768f1c089a5da41088598d2ce0a7667bee36d4253898693" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.012866 4775 scope.go:117] "RemoveContainer" containerID="627addc2c85e37362b52d3fb6c8bf68a28d4ccb4e89e4a9ec3b90a308bd74af5" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.028532 4775 scope.go:117] "RemoveContainer" containerID="491b1bd21a176e2c1cf30d3d886eeae01aaed27daf57848ae2febd6313e99500" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.045963 4775 scope.go:117] "RemoveContainer" containerID="109655ca6ffd54485a4225e75285723415f497107701a93b7eabd5ba19f9d8c5" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.061816 4775 scope.go:117] "RemoveContainer" containerID="fff264ae37c862c92f04505830404488875026a16f9b83753ca7e41d83f2d007" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.078829 4775 scope.go:117] "RemoveContainer" containerID="f1e6d54772ff94d2f7550c575e49644f4f416e66aeb5cf0ac93d1fe291d45072" Jan 27 11:31:22 crc kubenswrapper[4775]: I0127 11:31:22.097980 4775 scope.go:117] "RemoveContainer" containerID="46b0fa7b4f322cd8159c8b4884e8d19e7fb60878f02b6b171355ad20a713366e" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.194708 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195384 4775 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="pull" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195405 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="pull" Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195439 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: E0127 11:31:23.195473 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="util" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195485 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="util" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.195649 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="252d02e0-ca7d-405f-8315-3588f55a7b0c" containerName="extract" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.196186 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198224 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rtnxx" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198236 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.198746 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.203575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.302155 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: \"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.403499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: \"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.429515 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brvrv\" (UniqueName: \"kubernetes.io/projected/cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f-kube-api-access-brvrv\") pod \"nmstate-operator-646758c888-znzng\" (UID: \"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f\") " pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.510064 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.764742 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-znzng"] Jan 27 11:31:23 crc kubenswrapper[4775]: I0127 11:31:23.978520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" event={"ID":"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f","Type":"ContainerStarted","Data":"1cee26aecbd228ef746e224e9ea667077798e9b9561d4134fa1e12188cc3fc89"} Jan 27 11:31:27 crc kubenswrapper[4775]: I0127 11:31:27.007754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" event={"ID":"cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f","Type":"ContainerStarted","Data":"4695d30628f6cd34b24844835fb98d2b4f85c2ef36ce5e32dc9807eb433f189c"} Jan 27 11:31:27 crc kubenswrapper[4775]: I0127 11:31:27.033091 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-646758c888-znzng" podStartSLOduration=1.553827802 podStartE2EDuration="4.033050328s" podCreationTimestamp="2026-01-27 11:31:23 +0000 UTC" firstStartedPulling="2026-01-27 11:31:23.769095739 +0000 UTC m=+662.910693526" lastFinishedPulling="2026-01-27 11:31:26.248318275 +0000 UTC m=+665.389916052" observedRunningTime="2026-01-27 11:31:27.028908486 +0000 UTC m=+666.170506293" watchObservedRunningTime="2026-01-27 11:31:27.033050328 +0000 UTC m=+666.174648125" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.025784 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.026799 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.032259 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-p9dzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.040489 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.041327 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.048486 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.048910 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.072335 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.101211 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4vtwf"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.101990 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.160895 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.161063 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.161205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.177878 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.180248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185882 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185911 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4kp2r" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.185951 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.205848 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod 
\"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: E0127 11:31:28.262773 4775 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262859 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: E0127 11:31:28.262953 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair podName:d9f9feec-ee04-44de-8879-4071243ac6db nodeName:}" failed. No retries permitted until 2026-01-27 11:31:28.762917511 +0000 UTC m=+667.904515308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair") pod "nmstate-webhook-8474b5b9d8-d9lzh" (UID: "d9f9feec-ee04-44de-8879-4071243ac6db") : secret "openshift-nmstate-webhook" not found Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.262998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263254 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.263300 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.283201 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vd6\" (UniqueName: \"kubernetes.io/projected/4c84a5ec-b41d-4396-adea-3c9964cc7c59-kube-api-access-b7vd6\") pod \"nmstate-metrics-54757c584b-2qhwx\" (UID: \"4c84a5ec-b41d-4396-adea-3c9964cc7c59\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.283526 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4m2\" (UniqueName: \"kubernetes.io/projected/d9f9feec-ee04-44de-8879-4071243ac6db-kube-api-access-fg4m2\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.348360 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367320 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367415 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367470 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367500 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.367532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.368880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76d9c92d-c012-448b-8ff5-00f10c17c5a7-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.369556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-ovs-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.370074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-nmstate-lock\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.370489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-dbus-socket\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.375267 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76d9c92d-c012-448b-8ff5-00f10c17c5a7-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.389281 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.390106 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.390300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sx6kw\" (UniqueName: \"kubernetes.io/projected/76d9c92d-c012-448b-8ff5-00f10c17c5a7-kube-api-access-sx6kw\") pod \"nmstate-console-plugin-7754f76f8b-tm9vw\" (UID: \"76d9c92d-c012-448b-8ff5-00f10c17c5a7\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.406756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.411113 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8d4\" (UniqueName: \"kubernetes.io/projected/0aa6cbcb-077f-4ae7-85b2-d79679ef64df-kube-api-access-dc8d4\") pod \"nmstate-handler-4vtwf\" (UID: \"0aa6cbcb-077f-4ae7-85b2-d79679ef64df\") " pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.415751 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.447836 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0aa6cbcb_077f_4ae7_85b2_d79679ef64df.slice/crio-69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4 WatchSource:0}: Error finding container 69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4: Status 404 returned error can't find the container with id 69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4 Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468543 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468584 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.468941 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.469015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.469051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.504001 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.570668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571125 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571208 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.571478 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-oauth-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572553 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-trusted-ca-bundle\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.572778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-service-ca\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.576464 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-serving-cert\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.577383 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-console-oauth-config\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.589871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfnrj\" (UniqueName: \"kubernetes.io/projected/bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7-kube-api-access-rfnrj\") pod \"console-84c44595ff-qwwqd\" (UID: \"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7\") " pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.675600 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw"] Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.676813 
4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d9c92d_c012_448b_8ff5_00f10c17c5a7.slice/crio-a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2 WatchSource:0}: Error finding container a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2: Status 404 returned error can't find the container with id a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2 Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.737993 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.764741 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-2qhwx"] Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.773259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.781761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/d9f9feec-ee04-44de-8879-4071243ac6db-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-d9lzh\" (UID: \"d9f9feec-ee04-44de-8879-4071243ac6db\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.954188 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84c44595ff-qwwqd"] Jan 27 11:31:28 crc kubenswrapper[4775]: W0127 11:31:28.957734 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfe7bd0c_4325_4a4a_a4e7_fb0d4c990bb7.slice/crio-e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699 WatchSource:0}: Error finding container e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699: Status 404 returned error can't find the container with id e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699 Jan 27 11:31:28 crc kubenswrapper[4775]: I0127 11:31:28.964820 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.025648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c44595ff-qwwqd" event={"ID":"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7","Type":"ContainerStarted","Data":"e16ba07e6f32343049bdd0a8105617df715e989a7707c601e595125f6a36f699"} Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.027000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"982053ce13c449a8452182194cbe607fec59cc559be6e06ac08826aa661adc64"} Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.028372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" event={"ID":"76d9c92d-c012-448b-8ff5-00f10c17c5a7","Type":"ContainerStarted","Data":"a867ef9e379e9817fcafffd3f0587d5790800a2b0aa98b4aab073b76c15032e2"} Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.029779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4vtwf" event={"ID":"0aa6cbcb-077f-4ae7-85b2-d79679ef64df","Type":"ContainerStarted","Data":"69207126e988dcbb5447577a61d1154f7c4bf8835e6961b5edc47e5c6a4beec4"} Jan 27 11:31:29 crc kubenswrapper[4775]: I0127 11:31:29.150127 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh"] Jan 27 11:31:29 crc kubenswrapper[4775]: W0127 11:31:29.164013 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9f9feec_ee04_44de_8879_4071243ac6db.slice/crio-7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3 WatchSource:0}: Error finding container 7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3: Status 404 returned error can't find the container with id 7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3 Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.037761 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" event={"ID":"d9f9feec-ee04-44de-8879-4071243ac6db","Type":"ContainerStarted","Data":"7b03fbae930bf1a9c0d11088644d29d93cdd5b84f167e1c9abc5f3927c451fc3"} Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.039303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84c44595ff-qwwqd" event={"ID":"bfe7bd0c-4325-4a4a-a4e7-fb0d4c990bb7","Type":"ContainerStarted","Data":"c7c2c326647ef2564d34ef361d65662db23fcdeda4d850dd5a72b45fd4f7e386"} Jan 27 11:31:30 crc kubenswrapper[4775]: I0127 11:31:30.056699 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84c44595ff-qwwqd" podStartSLOduration=2.056681216 podStartE2EDuration="2.056681216s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:31:30.054897367 +0000 UTC m=+669.196495144" watchObservedRunningTime="2026-01-27 11:31:30.056681216 +0000 UTC m=+669.198278983" Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.051432 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" 
event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"c9aad9050ef97adc11acc62a5017d1bbad96d5367d230f0ce862f05c7fb52775"} Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.054115 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" event={"ID":"d9f9feec-ee04-44de-8879-4071243ac6db","Type":"ContainerStarted","Data":"ad20f0aecba19368384b3c78e7bc1a28d738c995c83b1c9f0fe81147c8d01b56"} Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.054426 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.056306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" event={"ID":"76d9c92d-c012-448b-8ff5-00f10c17c5a7","Type":"ContainerStarted","Data":"f7c4c875e34078949e7bb169f7b0ed5babe168d72546d17e1821c4a1958894fc"} Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.057903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4vtwf" event={"ID":"0aa6cbcb-077f-4ae7-85b2-d79679ef64df","Type":"ContainerStarted","Data":"a53f78ac6c47508c2dfea94e941a23e71f0ca9c499b2776a3f7d624db5b2737d"} Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.058047 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.074475 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" podStartSLOduration=2.089719766 podStartE2EDuration="4.074436708s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:29.167891359 +0000 UTC m=+668.309489136" lastFinishedPulling="2026-01-27 11:31:31.152608301 +0000 UTC m=+670.294206078" observedRunningTime="2026-01-27 11:31:32.069207966 +0000 UTC m=+671.210805763" watchObservedRunningTime="2026-01-27 11:31:32.074436708 +0000 UTC m=+671.216034495" Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.088993 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4vtwf" podStartSLOduration=1.405721123 podStartE2EDuration="4.088975464s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.45020746 +0000 UTC m=+667.591805237" lastFinishedPulling="2026-01-27 11:31:31.133461801 +0000 UTC m=+670.275059578" observedRunningTime="2026-01-27 11:31:32.088244094 +0000 UTC m=+671.229841911" watchObservedRunningTime="2026-01-27 11:31:32.088975464 +0000 UTC m=+671.230573261" Jan 27 11:31:32 crc kubenswrapper[4775]: I0127 11:31:32.119823 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-tm9vw" podStartSLOduration=1.662898175 podStartE2EDuration="4.119805773s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.678972898 +0000 UTC m=+667.820570675" lastFinishedPulling="2026-01-27 11:31:31.135880496 +0000 UTC m=+670.277478273" observedRunningTime="2026-01-27 11:31:32.116243067 +0000 UTC m=+671.257840854" watchObservedRunningTime="2026-01-27 11:31:32.119805773 +0000 UTC m=+671.261403550" Jan 27 11:31:34 crc kubenswrapper[4775]: I0127 11:31:34.069800 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" event={"ID":"4c84a5ec-b41d-4396-adea-3c9964cc7c59","Type":"ContainerStarted","Data":"d4823ca5709ba44722036bfeb628dd796de39c293301d0ccbfb15c732c75c316"} Jan 27 11:31:34 crc kubenswrapper[4775]: I0127 11:31:34.092251 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-2qhwx" podStartSLOduration=1.331951986 podStartE2EDuration="6.092226272s" podCreationTimestamp="2026-01-27 11:31:28 +0000 UTC" firstStartedPulling="2026-01-27 11:31:28.771891618 +0000 UTC m=+667.913489395" lastFinishedPulling="2026-01-27 11:31:33.532165864 +0000 UTC m=+672.673763681" observedRunningTime="2026-01-27 11:31:34.082939189 +0000 UTC m=+673.224537006" watchObservedRunningTime="2026-01-27 11:31:34.092226272 +0000 UTC m=+673.233824079" Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.438174 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-4vtwf" Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.738239 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.738313 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:38 crc kubenswrapper[4775]: I0127 11:31:38.745866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:39 crc kubenswrapper[4775]: I0127 11:31:39.107752 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84c44595ff-qwwqd" Jan 27 11:31:39 crc kubenswrapper[4775]: I0127 11:31:39.191004 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:31:48 crc kubenswrapper[4775]: I0127 11:31:48.973068 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-d9lzh" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.078865 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"] Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.081153 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.083895 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.097281 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"] Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179490 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179586 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.179756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.280873 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.281321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.281569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.306018 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.405078 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:03 crc kubenswrapper[4775]: I0127 11:32:03.689248 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk"] Jan 27 11:32:03 crc kubenswrapper[4775]: W0127 11:32:03.700920 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99ed53a2_63f4_4636_b581_2a686d44d5d0.slice/crio-76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846 WatchSource:0}: Error finding container 76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846: Status 404 returned error can't find the container with id 76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846 Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.264313 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hj8rf" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" containerID="cri-o://94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" gracePeriod=15 Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273527 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="b3c1106a62535249c344a2ba38dd2af2783f4df62b77cd7cef2cb50afd049328" exitCode=0 Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273579 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"b3c1106a62535249c344a2ba38dd2af2783f4df62b77cd7cef2cb50afd049328"} Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.273613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerStarted","Data":"76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846"} Jan 27 11:32:04 crc 
kubenswrapper[4775]: I0127 11:32:04.621388 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hj8rf_ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/console/0.log" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.621492 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697596 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697682 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697720 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.697834 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") pod \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\" (UID: \"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf\") " Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698808 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config" (OuterVolumeSpecName: "console-config") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698827 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.698913 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca" (OuterVolumeSpecName: "service-ca") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704180 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.704495 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl" (OuterVolumeSpecName: "kube-api-access-qvbnl") pod "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" (UID: "ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf"). InnerVolumeSpecName "kube-api-access-qvbnl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799429 4775 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799504 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvbnl\" (UniqueName: \"kubernetes.io/projected/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-kube-api-access-qvbnl\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799522 4775 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799534 4775 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799545 4775 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799557 4775 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:04 crc kubenswrapper[4775]: I0127 11:32:04.799596 4775 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.284883 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hj8rf_ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/console/0.log" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.285883 4775 generic.go:334] "Generic (PLEG): container finished" podID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" exitCode=2 Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.285992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerDied","Data":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hj8rf" event={"ID":"ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf","Type":"ContainerDied","Data":"152d04ae80ec3e4ea65562160c2d55c0e2688c495a74f3bc1b1fca916b3879fa"} Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286025 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hj8rf" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.286390 4775 scope.go:117] "RemoveContainer" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.314198 4775 scope.go:117] "RemoveContainer" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: E0127 11:32:05.314994 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": container with ID starting with 94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a not found: ID does not exist" containerID="94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.315109 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a"} err="failed to get container status \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": rpc error: code = NotFound desc = could not find container \"94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a\": container with ID starting with 94b6380b13b3881afb17f347708c2db2998fbbc605a638343b0c1f8a8851455a not found: ID does not exist" Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.332572 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.336503 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hj8rf"] Jan 27 11:32:05 crc kubenswrapper[4775]: I0127 11:32:05.750840 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" path="/var/lib/kubelet/pods/ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf/volumes" Jan 27 11:32:06 crc kubenswrapper[4775]: I0127 11:32:06.294260 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="f69691451111e14034d367b659566622a72253576f1d06403550dc4371afa6fa" exitCode=0 Jan 27 11:32:06 crc kubenswrapper[4775]: I0127 11:32:06.294305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"f69691451111e14034d367b659566622a72253576f1d06403550dc4371afa6fa"} Jan 27 11:32:07 crc kubenswrapper[4775]: I0127 11:32:07.317801 4775 generic.go:334] "Generic (PLEG): container finished" podID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerID="339bbc62a1aae1c202c0dc66cd40de2f500ad662feae5351cd8fac675c93837e" exitCode=0 Jan 27 11:32:07 crc kubenswrapper[4775]: I0127 11:32:07.317899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"339bbc62a1aae1c202c0dc66cd40de2f500ad662feae5351cd8fac675c93837e"} Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.626215 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758244 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.758338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") pod \"99ed53a2-63f4-4636-b581-2a686d44d5d0\" (UID: \"99ed53a2-63f4-4636-b581-2a686d44d5d0\") " Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.760504 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle" (OuterVolumeSpecName: "bundle") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.764931 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw" (OuterVolumeSpecName: "kube-api-access-cgrtw") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "kube-api-access-cgrtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.772033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util" (OuterVolumeSpecName: "util") pod "99ed53a2-63f4-4636-b581-2a686d44d5d0" (UID: "99ed53a2-63f4-4636-b581-2a686d44d5d0"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860333 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgrtw\" (UniqueName: \"kubernetes.io/projected/99ed53a2-63f4-4636-b581-2a686d44d5d0-kube-api-access-cgrtw\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860386 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:08 crc kubenswrapper[4775]: I0127 11:32:08.860446 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99ed53a2-63f4-4636-b581-2a686d44d5d0-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335583 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" event={"ID":"99ed53a2-63f4-4636-b581-2a686d44d5d0","Type":"ContainerDied","Data":"76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846"} Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335916 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76d9e1c3a6eff2eb78dfc35e084b4108b30c37b36899e819bba9d91cd7762846" Jan 27 11:32:09 crc kubenswrapper[4775]: I0127 11:32:09.335933 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.150685 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"] Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151501 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151516 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151525 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="pull" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151532 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="pull" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151546 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151554 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" Jan 27 11:32:18 crc kubenswrapper[4775]: E0127 11:32:18.151568 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="util" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151576 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="util" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151687 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff82a739-2cf1-4c0f-b80f-9aad9ae7fddf" containerName="console" Jan 
27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.151700 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="99ed53a2-63f4-4636-b581-2a686d44d5d0" containerName="extract" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.152153 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155511 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155782 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fptrf" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155915 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.155973 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.166386 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.223009 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285165 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285421 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.285505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386604 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.386699 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.392280 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-apiservice-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.392349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7560029a-575e-4d87-b4e8-4f090c5a7cd9-webhook-cert\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.406490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g568\" (UniqueName: \"kubernetes.io/projected/7560029a-575e-4d87-b4e8-4f090c5a7cd9-kube-api-access-7g568\") pod \"metallb-operator-controller-manager-7c8c7fc46c-g7l74\" (UID: \"7560029a-575e-4d87-b4e8-4f090c5a7cd9\") " pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.469579 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.490615 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.491473 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.494754 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.494897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.495176 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q86kf" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.517634 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.624630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.624998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.625172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726876 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726964 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.726987 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 
11:32:18.735550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-apiservice-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.756262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgk9\" (UniqueName: \"kubernetes.io/projected/acb19b04-4cd3-4304-a572-d25d4aa2932b-kube-api-access-swgk9\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.758323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/acb19b04-4cd3-4304-a572-d25d4aa2932b-webhook-cert\") pod \"metallb-operator-webhook-server-6b85bfbbbb-bb966\" (UID: \"acb19b04-4cd3-4304-a572-d25d4aa2932b\") " pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.780808 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74"] Jan 27 11:32:18 crc kubenswrapper[4775]: I0127 11:32:18.846023 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.078666 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966"] Jan 27 11:32:19 crc kubenswrapper[4775]: W0127 11:32:19.087411 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacb19b04_4cd3_4304_a572_d25d4aa2932b.slice/crio-34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2 WatchSource:0}: Error finding container 34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2: Status 404 returned error can't find the container with id 34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2 Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.388895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" event={"ID":"acb19b04-4cd3-4304-a572-d25d4aa2932b","Type":"ContainerStarted","Data":"34e54cae0282d1302a27c32f940adadd39a371221d672332b3caedbdda89e9b2"} Jan 27 11:32:19 crc kubenswrapper[4775]: I0127 11:32:19.389881 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" event={"ID":"7560029a-575e-4d87-b4e8-4f090c5a7cd9","Type":"ContainerStarted","Data":"72f6506066b09282f04c67a68caf2788d1fc79b1526106321ff8d1003ba93f77"} Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.435662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" event={"ID":"7560029a-575e-4d87-b4e8-4f090c5a7cd9","Type":"ContainerStarted","Data":"adad9e29af6591f758ea555c02105819fdbdf11cb0d16c7b3575edbf92d4167f"} Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.437162 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.445153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" event={"ID":"acb19b04-4cd3-4304-a572-d25d4aa2932b","Type":"ContainerStarted","Data":"85eae189b415d15a57f4e923c259603f2faba9a718a1f3dc47c94c2d133dd878"} Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.445908 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:24 crc kubenswrapper[4775]: I0127 11:32:24.466720 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" podStartSLOduration=1.86886559 podStartE2EDuration="6.466697738s" podCreationTimestamp="2026-01-27 11:32:18 +0000 UTC" firstStartedPulling="2026-01-27 11:32:18.789994365 +0000 UTC m=+717.931592142" lastFinishedPulling="2026-01-27 11:32:23.387826513 +0000 UTC m=+722.529424290" observedRunningTime="2026-01-27 11:32:24.461756393 +0000 UTC m=+723.603354170" watchObservedRunningTime="2026-01-27 11:32:24.466697738 +0000 UTC m=+723.608295515" Jan 27 11:32:29 crc kubenswrapper[4775]: I0127 11:32:29.518260 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:32:29 crc kubenswrapper[4775]: I0127 11:32:29.518604 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:32:38 crc kubenswrapper[4775]: I0127 11:32:38.876274 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" Jan 27 11:32:38 crc kubenswrapper[4775]: I0127 11:32:38.899374 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6b85bfbbbb-bb966" podStartSLOduration=16.527767836 podStartE2EDuration="20.899352722s" podCreationTimestamp="2026-01-27 11:32:18 +0000 UTC" firstStartedPulling="2026-01-27 11:32:19.091303192 +0000 UTC m=+718.232900969" lastFinishedPulling="2026-01-27 11:32:23.462888088 +0000 UTC m=+722.604485855" observedRunningTime="2026-01-27 11:32:24.479373783 +0000 UTC m=+723.620971560" watchObservedRunningTime="2026-01-27 11:32:38.899352722 +0000 UTC m=+738.040950499" Jan 27 11:32:58 crc kubenswrapper[4775]: I0127 11:32:58.472290 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7c8c7fc46c-g7l74" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.214953 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.215940 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.219373 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tbqt8" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.227817 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-52txr"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.230027 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.230061 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.233924 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.234137 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.234614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.286582 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-qm9dq"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.287403 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289066 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wdvrg" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289588 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289727 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.289731 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.308030 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.309043 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.310396 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.312700 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351603 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351644 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351738 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " 
pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.351937 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352077 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.352164 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453104 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453162 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453180 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453199 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453218 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453250 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453266 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453326 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453348 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.453369 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.453492 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.453551 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist podName:5573a041-6f7e-4c23-b2ea-42de01c96cdd nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.953527194 +0000 UTC m=+759.095124971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist") pod "speaker-qm9dq" (UID: "5573a041-6f7e-4c23-b2ea-42de01c96cdd") : secret "metallb-memberlist" not found Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.454753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-sockets\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454781 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454851 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert podName:de8a1d9c-9c8b-4200-92ae-b82c65b24d56 nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.95483045 +0000 UTC m=+759.096428227 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert") pod "frr-k8s-webhook-server-7df86c4f6c-ht6jz" (UID: "de8a1d9c-9c8b-4200-92ae-b82c65b24d56") : secret "frr-k8s-webhook-server-cert" not found Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454870 4775 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.454913 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs podName:ac3b8043-04c7-4036-9dc5-6068d914356c nodeName:}" failed. No retries permitted until 2026-01-27 11:32:59.954901822 +0000 UTC m=+759.096499599 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs") pod "frr-k8s-52txr" (UID: "ac3b8043-04c7-4036-9dc5-6068d914356c") : secret "frr-k8s-certs-secret" not found Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-reloader\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-conf\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/ac3b8043-04c7-4036-9dc5-6068d914356c-frr-startup\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.455547 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metallb-excludel2\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.460023 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-metrics-certs\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.472069 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgq6\" (UniqueName: \"kubernetes.io/projected/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-kube-api-access-swgq6\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.476862 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz8mm\" (UniqueName: \"kubernetes.io/projected/5573a041-6f7e-4c23-b2ea-42de01c96cdd-kube-api-access-qz8mm\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.479797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv2hg\" (UniqueName: \"kubernetes.io/projected/ac3b8043-04c7-4036-9dc5-6068d914356c-kube-api-access-fv2hg\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: 
I0127 11:32:59.518054 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.518101 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554551 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554641 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.554772 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.557521 4775 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.559194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-metrics-certs\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.568414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-cert\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.594666 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnmwc\" (UniqueName: \"kubernetes.io/projected/6bd75754-cf96-4b57-bfd3-711aa3dc06e6-kube-api-access-cnmwc\") pod \"controller-6968d8fdc4-4tjsf\" (UID: \"6bd75754-cf96-4b57-bfd3-711aa3dc06e6\") " pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.625733 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.815697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-4tjsf"] Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.959716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.960522 4775 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 27 11:32:59 crc kubenswrapper[4775]: E0127 11:32:59.960646 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist podName:5573a041-6f7e-4c23-b2ea-42de01c96cdd nodeName:}" failed. No retries permitted until 2026-01-27 11:33:00.960605855 +0000 UTC m=+760.102203632 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist") pod "speaker-qm9dq" (UID: "5573a041-6f7e-4c23-b2ea-42de01c96cdd") : secret "metallb-memberlist" not found Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.965064 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ac3b8043-04c7-4036-9dc5-6068d914356c-metrics-certs\") pod \"frr-k8s-52txr\" (UID: \"ac3b8043-04c7-4036-9dc5-6068d914356c\") " pod="metallb-system/frr-k8s-52txr" Jan 27 11:32:59 crc kubenswrapper[4775]: I0127 11:32:59.967368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/de8a1d9c-9c8b-4200-92ae-b82c65b24d56-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-ht6jz\" (UID: \"de8a1d9c-9c8b-4200-92ae-b82c65b24d56\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.145998 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.154069 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.366831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz"] Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.391558 4775 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.647463 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" event={"ID":"de8a1d9c-9c8b-4200-92ae-b82c65b24d56","Type":"ContainerStarted","Data":"3835776c46ca061097373c41c47490955b2540c241016c1909a6ce5df5616661"} Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.648347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"7d56a7db9b8bdfab77cc2b371062405f4969a8cdfdb20859159f209b38363b5c"} Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"b3e240f6e2869ea9daf2581758b9fbce1caac2aa2a41f5fa2f06964c4278406b"} Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649942 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"23dbcd1d23d30655f6b5395a6550e1b378527870653d8bd7e9162404a9c0b28d"} Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.649953 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-4tjsf" event={"ID":"6bd75754-cf96-4b57-bfd3-711aa3dc06e6","Type":"ContainerStarted","Data":"ee3e6815c2b9377345d90b66b56c0f99e0acd07813c70a75afafefa19d248586"} Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.650069 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.674291 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-4tjsf" podStartSLOduration=1.6742714429999999 podStartE2EDuration="1.674271443s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:33:00.669933175 +0000 UTC m=+759.811530952" watchObservedRunningTime="2026-01-27 11:33:00.674271443 +0000 UTC m=+759.815869220" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.972339 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:33:00 crc kubenswrapper[4775]: I0127 11:33:00.986248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/5573a041-6f7e-4c23-b2ea-42de01c96cdd-memberlist\") pod \"speaker-qm9dq\" (UID: \"5573a041-6f7e-4c23-b2ea-42de01c96cdd\") " pod="metallb-system/speaker-qm9dq" Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.101582 4775 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-qm9dq" Jan 27 11:33:01 crc kubenswrapper[4775]: W0127 11:33:01.157049 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5573a041_6f7e_4c23_b2ea_42de01c96cdd.slice/crio-90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567 WatchSource:0}: Error finding container 90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567: Status 404 returned error can't find the container with id 90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567 Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.659157 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"3c9e971f1c524bbd200da1957ca5f480fa8d28a840f5e1dcf956e8b53e340463"} Jan 27 11:33:01 crc kubenswrapper[4775]: I0127 11:33:01.659474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"90905a8c6ac92e742ebbf2425bac13323277f059c7d18b6d6c9bccc306b12567"} Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.677151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-qm9dq" event={"ID":"5573a041-6f7e-4c23-b2ea-42de01c96cdd","Type":"ContainerStarted","Data":"0c371d696aeaa53d0296327cd7c9b2c25fe3b3f085e206b979eb205dbf6d192e"} Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.677293 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-qm9dq" Jan 27 11:33:02 crc kubenswrapper[4775]: I0127 11:33:02.698109 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-qm9dq" podStartSLOduration=3.698088243 podStartE2EDuration="3.698088243s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:33:02.696361267 +0000 UTC m=+761.837959054" watchObservedRunningTime="2026-01-27 11:33:02.698088243 +0000 UTC m=+761.839686020" Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.720219 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" event={"ID":"de8a1d9c-9c8b-4200-92ae-b82c65b24d56","Type":"ContainerStarted","Data":"c170a171d6cf9bc9c8588394f975859d58cad8b1bf82fd83c02fad87aed36ace"} Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.720878 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.722229 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="64be078422524f86eb16ffe658bb47bef69ebfd330b5afa90579cb1d29df9506" exitCode=0 Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.722294 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"64be078422524f86eb16ffe658bb47bef69ebfd330b5afa90579cb1d29df9506"} Jan 27 11:33:07 crc kubenswrapper[4775]: I0127 11:33:07.745868 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" podStartSLOduration=1.720765268 podStartE2EDuration="8.745849696s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="2026-01-27 11:33:00.374178959 +0000 UTC m=+759.515776736" lastFinishedPulling="2026-01-27 11:33:07.399263367 +0000 UTC m=+766.540861164" observedRunningTime="2026-01-27 11:33:07.744917171 +0000 UTC m=+766.886514968" watchObservedRunningTime="2026-01-27 11:33:07.745849696 +0000 UTC m=+766.887447473" Jan 27 11:33:08 crc kubenswrapper[4775]: I0127 11:33:08.731731 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="99670cfcfebfec066858b73a8de9b5d78b915baeb44fecb97e2434c4b60cf99f" exitCode=0 Jan 27 11:33:08 crc kubenswrapper[4775]: I0127 11:33:08.731795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"99670cfcfebfec066858b73a8de9b5d78b915baeb44fecb97e2434c4b60cf99f"} Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.630237 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-4tjsf" Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.740678 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac3b8043-04c7-4036-9dc5-6068d914356c" containerID="d40b5fef1225b0a1cebb8e29b3c22bf788afccb31b8eae58ec7b61a1aa769377" exitCode=0 Jan 27 11:33:09 crc kubenswrapper[4775]: I0127 11:33:09.740741 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerDied","Data":"d40b5fef1225b0a1cebb8e29b3c22bf788afccb31b8eae58ec7b61a1aa769377"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.749929 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"9b1bb7b622b0338596a54aa140d2e3bdaec2cd508f17f9a25c3c35c890201291"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750276 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"129f9cb7db24ef4a502b80e9fe53a32bc180e3ee8cce85a61d023252411b02e2"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750304 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"ab4938dd6124bfe745b22865b226f2933fcab80254ba4d55a4e5374c49b45c26"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"ab4f112b75b075a1e0f42c35fd3faf0cbac80210ac42b5d70fe68cdef4bb8f06"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.750324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"84fa7fb410daaa73523d8f0ceaa0a5a5b1a69aca31b1824f97cceb86ddbd2a94"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 
11:33:10.750333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-52txr" event={"ID":"ac3b8043-04c7-4036-9dc5-6068d914356c","Type":"ContainerStarted","Data":"e54609e76012448cc9b14400734594a7b3db675e82a2974443932e69d30c8599"} Jan 27 11:33:10 crc kubenswrapper[4775]: I0127 11:33:10.779637 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-52txr" podStartSLOduration=4.657806273 podStartE2EDuration="11.779621715s" podCreationTimestamp="2026-01-27 11:32:59 +0000 UTC" firstStartedPulling="2026-01-27 11:33:00.296680199 +0000 UTC m=+759.438277966" lastFinishedPulling="2026-01-27 11:33:07.418495631 +0000 UTC m=+766.560093408" observedRunningTime="2026-01-27 11:33:10.774934248 +0000 UTC m=+769.916532035" watchObservedRunningTime="2026-01-27 11:33:10.779621715 +0000 UTC m=+769.921219492" Jan 27 11:33:11 crc kubenswrapper[4775]: I0127 11:33:11.106019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-qm9dq" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.861686 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.862885 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865424 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9zhfn" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865530 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.865656 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.877280 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:13 crc kubenswrapper[4775]: I0127 11:33:13.949245 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.050719 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.067582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"openstack-operator-index-wlhh5\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.210799 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.639734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:14 crc kubenswrapper[4775]: W0127 11:33:14.648975 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024a719c_d757_41d5_b790_fc3c75d0b4ee.slice/crio-ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3 WatchSource:0}: Error finding container ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3: Status 404 returned error can't find the container with id ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3 Jan 27 11:33:14 crc kubenswrapper[4775]: I0127 11:33:14.790757 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerStarted","Data":"ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3"} Jan 27 11:33:15 crc kubenswrapper[4775]: I0127 11:33:15.154742 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:15 crc kubenswrapper[4775]: I0127 11:33:15.193293 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.242492 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.815434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerStarted","Data":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.837422 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wlhh5" podStartSLOduration=2.794700637 podStartE2EDuration="4.837394363s" podCreationTimestamp="2026-01-27 11:33:13 +0000 UTC" firstStartedPulling="2026-01-27 11:33:14.651033938 +0000 UTC m=+773.792631755" lastFinishedPulling="2026-01-27 11:33:16.693727704 +0000 UTC m=+775.835325481" observedRunningTime="2026-01-27 11:33:17.832742217 +0000 UTC m=+776.974340014" watchObservedRunningTime="2026-01-27 11:33:17.837394363 +0000 UTC m=+776.978992140" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.854216 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.855198 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.862605 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:17 crc kubenswrapper[4775]: I0127 11:33:17.899987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.002352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.036716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cxmh\" (UniqueName: \"kubernetes.io/projected/56b44f0b-813c-4626-a8ec-54ac78bbb086-kube-api-access-8cxmh\") pod \"openstack-operator-index-swjcb\" (UID: \"56b44f0b-813c-4626-a8ec-54ac78bbb086\") " pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.186917 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.493938 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swjcb"] Jan 27 11:33:18 crc kubenswrapper[4775]: W0127 11:33:18.500805 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b44f0b_813c_4626_a8ec_54ac78bbb086.slice/crio-0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec WatchSource:0}: Error finding container 0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec: Status 404 returned error can't find the container with id 0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.822599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swjcb" event={"ID":"56b44f0b-813c-4626-a8ec-54ac78bbb086","Type":"ContainerStarted","Data":"0ef6cac96f3b0cd5c26c10df66c6a7d7ca41e1f5cf7f4e4825fcda675017eeec"} Jan 27 11:33:18 crc kubenswrapper[4775]: I0127 11:33:18.822771 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wlhh5" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" containerID="cri-o://d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" gracePeriod=2 Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.285194 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.424934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") pod \"024a719c-d757-41d5-b790-fc3c75d0b4ee\" (UID: \"024a719c-d757-41d5-b790-fc3c75d0b4ee\") " Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.434079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f" (OuterVolumeSpecName: "kube-api-access-ftr7f") pod "024a719c-d757-41d5-b790-fc3c75d0b4ee" (UID: "024a719c-d757-41d5-b790-fc3c75d0b4ee"). InnerVolumeSpecName "kube-api-access-ftr7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.527171 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftr7f\" (UniqueName: \"kubernetes.io/projected/024a719c-d757-41d5-b790-fc3c75d0b4ee-kube-api-access-ftr7f\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.832712 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swjcb" event={"ID":"56b44f0b-813c-4626-a8ec-54ac78bbb086","Type":"ContainerStarted","Data":"78412061b1d15aeddf791256458359d5cc017abc1289bc61fa8ae1a5e63d4ab4"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834841 4775 generic.go:334] "Generic (PLEG): container finished" podID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" exitCode=0 Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834875 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerDied","Data":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834895 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wlhh5" event={"ID":"024a719c-d757-41d5-b790-fc3c75d0b4ee","Type":"ContainerDied","Data":"ac9957f2ff3ed0ec5d0f41438efc2afaa4f632e316459a08f98cdbb50bc737c3"} Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.834913 4775 scope.go:117] "RemoveContainer" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.835010 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wlhh5" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.877599 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-swjcb" podStartSLOduration=2.496677256 podStartE2EDuration="2.877572251s" podCreationTimestamp="2026-01-27 11:33:17 +0000 UTC" firstStartedPulling="2026-01-27 11:33:18.505964112 +0000 UTC m=+777.647561889" lastFinishedPulling="2026-01-27 11:33:18.886859057 +0000 UTC m=+778.028456884" observedRunningTime="2026-01-27 11:33:19.855558001 +0000 UTC m=+778.997155808" watchObservedRunningTime="2026-01-27 11:33:19.877572251 +0000 UTC m=+779.019170068" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.882182 4775 scope.go:117] "RemoveContainer" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: E0127 11:33:19.882919 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": container with ID starting with d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95 not found: ID does not exist" containerID="d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.882993 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95"} err="failed to get container status \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": rpc error: code = NotFound desc = could not find container \"d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95\": container with ID starting with d23a98937f1d6eb720c34bf0804df8e3e2b12303d5e773f04eb31075eb2b6a95 not found: ID does not exist" Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.883758 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:19 crc kubenswrapper[4775]: I0127 11:33:19.893853 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wlhh5"] Jan 27 11:33:20 crc kubenswrapper[4775]: I0127 11:33:20.153091 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-ht6jz" Jan 27 11:33:20 crc kubenswrapper[4775]: I0127 11:33:20.157619 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-52txr" Jan 27 11:33:21 crc kubenswrapper[4775]: I0127 11:33:21.760808 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" path="/var/lib/kubelet/pods/024a719c-d757-41d5-b790-fc3c75d0b4ee/volumes" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.187506 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.188145 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 11:33:28.234580 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:28 crc kubenswrapper[4775]: I0127 
11:33:28.934294 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-swjcb" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.517880 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.518752 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.518797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.519291 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.519343 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad" gracePeriod=600 Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.909208 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad" exitCode=0 Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.909400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad"} Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.910129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"} Jan 27 11:33:29 crc kubenswrapper[4775]: I0127 11:33:29.910231 4775 scope.go:117] "RemoveContainer" containerID="b6bfc560dd2b425e637beb4eff36549cfb04f80cf81bd519c26996484ee2498d" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.340049 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:36 crc kubenswrapper[4775]: E0127 11:33:36.340902 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.340919 4775 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.341066 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="024a719c-d757-41d5-b790-fc3c75d0b4ee" containerName="registry-server" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.341965 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.345779 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gn5z5" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.366104 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480643 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.480835 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod 
\"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582703 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.582739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.608735 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:36 crc kubenswrapper[4775]: I0127 11:33:36.670446 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.137260 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9"] Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.986967 4775 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="36c33e59210f8597590e58ad1640b9d860f65abc02a3ec940c173013312f6d4e" exitCode=0 Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.987080 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"36c33e59210f8597590e58ad1640b9d860f65abc02a3ec940c173013312f6d4e"} Jan 27 11:33:37 crc kubenswrapper[4775]: I0127 11:33:37.987333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerStarted","Data":"20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0"} Jan 27 11:33:38 crc kubenswrapper[4775]: I0127 11:33:38.994255 4775 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="29316e19d76aade267caba578e797b4bd9ecc8d4d9e7f4f92d321dc5e0a535e5" exitCode=0 Jan 27 11:33:38 crc kubenswrapper[4775]: I0127 11:33:38.994492 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" 
event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"29316e19d76aade267caba578e797b4bd9ecc8d4d9e7f4f92d321dc5e0a535e5"} Jan 27 11:33:40 crc kubenswrapper[4775]: I0127 11:33:40.001990 4775 generic.go:334] "Generic (PLEG): container finished" podID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerID="3b517171e2f225019f41f9077a13977c7d266c910909e4f7a2dd8f129053e996" exitCode=0 Jan 27 11:33:40 crc kubenswrapper[4775]: I0127 11:33:40.002041 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"3b517171e2f225019f41f9077a13977c7d266c910909e4f7a2dd8f129053e996"} Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.313595 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450654 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450747 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.450867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") pod \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\" (UID: \"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd\") " Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.451531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle" (OuterVolumeSpecName: "bundle") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.457026 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv" (OuterVolumeSpecName: "kube-api-access-n75pv") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "kube-api-access-n75pv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.465114 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util" (OuterVolumeSpecName: "util") pod "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" (UID: "dcd9d0e9-c9de-479d-b62f-f4403ffa22dd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.567955 4775 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-util\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.568025 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n75pv\" (UniqueName: \"kubernetes.io/projected/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-kube-api-access-n75pv\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:41 crc kubenswrapper[4775]: I0127 11:33:41.568043 4775 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcd9d0e9-c9de-479d-b62f-f4403ffa22dd-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.018960 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" event={"ID":"dcd9d0e9-c9de-479d-b62f-f4403ffa22dd","Type":"ContainerDied","Data":"20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0"} Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.019388 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20e7437ae880be1bf8b4b306976215c9dac277d8f8aed6df3b7f71a1162ab2b0" Jan 27 11:33:42 crc kubenswrapper[4775]: I0127 11:33:42.019343 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.991233 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992047 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="util" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992063 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="util" Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992078 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="pull" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992085 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="pull" Jan 27 11:33:48 crc kubenswrapper[4775]: E0127 11:33:48.992104 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992114 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992246 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcd9d0e9-c9de-479d-b62f-f4403ffa22dd" containerName="extract" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.992748 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:48 crc kubenswrapper[4775]: I0127 11:33:48.994810 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-qhtfs" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.020837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.088025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.188851 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.215357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mp8k\" (UniqueName: \"kubernetes.io/projected/8868fb89-f25b-48ef-b4e2-9acab9f78790-kube-api-access-8mp8k\") pod \"openstack-operator-controller-init-6bfcf7b875-z4vw8\" (UID: \"8868fb89-f25b-48ef-b4e2-9acab9f78790\") " pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.327698 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:49 crc kubenswrapper[4775]: I0127 11:33:49.863099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8"] Jan 27 11:33:50 crc kubenswrapper[4775]: I0127 11:33:50.072180 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" event={"ID":"8868fb89-f25b-48ef-b4e2-9acab9f78790","Type":"ContainerStarted","Data":"fccdbb5cfc07cddd2b172c42a0dbc3411689c4628eae3d3bb86468e0faed9304"} Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.106514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" event={"ID":"8868fb89-f25b-48ef-b4e2-9acab9f78790","Type":"ContainerStarted","Data":"9902d0c5c5d528c83a34f60d79d942798887e3d652681eafbc131a6c7ceeb030"} Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.107280 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:33:55 crc kubenswrapper[4775]: I0127 11:33:55.151938 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" podStartSLOduration=2.908720325 podStartE2EDuration="7.151915575s" podCreationTimestamp="2026-01-27 11:33:48 +0000 UTC" firstStartedPulling="2026-01-27 11:33:49.880148091 +0000 UTC m=+809.021745868" lastFinishedPulling="2026-01-27 11:33:54.123343341 +0000 UTC m=+813.264941118" observedRunningTime="2026-01-27 11:33:55.135371545 +0000 UTC m=+814.276969322" watchObservedRunningTime="2026-01-27 11:33:55.151915575 +0000 UTC m=+814.293513352" Jan 27 11:33:59 crc kubenswrapper[4775]: I0127 11:33:59.330617 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6bfcf7b875-z4vw8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.003456 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.004545 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.007115 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-qdzw9" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.014302 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.015286 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.016963 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vn2x4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.018122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.041891 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.046352 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.047069 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.061529 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-bpwv8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.097349 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.098075 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.102729 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hr54n" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.103526 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.104715 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.106786 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dzxkr" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.121853 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.146780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150088 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524xg\" (UniqueName: \"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.150216 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.157328 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.166032 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.167048 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.185959 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-j58ml" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.222975 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.223948 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.230571 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-slgvx" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.246608 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.247424 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.248926 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.249330 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-5sbqs" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251799 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.251983 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " 
pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.252022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-524xg\" (UniqueName: \"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.252050 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.256160 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.277773 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.279098 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.280971 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-5dfsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.282067 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9v4\" (UniqueName: \"kubernetes.io/projected/dd9264fb-034f-46d3-8698-dcc6fc3470f6-kube-api-access-7t9v4\") pod \"heat-operator-controller-manager-658dd65b86-jp5c7\" (UID: \"dd9264fb-034f-46d3-8698-dcc6fc3470f6\") " pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.284075 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdxk\" (UniqueName: \"kubernetes.io/projected/f04fa2a0-7af2-439a-9169-6edf5be65b35-kube-api-access-wrdxk\") pod \"cinder-operator-controller-manager-5fdc687f5-9wc4j\" (UID: \"f04fa2a0-7af2-439a-9169-6edf5be65b35\") " pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.293763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nr5q\" (UniqueName: \"kubernetes.io/projected/c31d5b06-1ad2-4914-96c1-e0f0b8c4974e-kube-api-access-6nr5q\") pod \"designate-operator-controller-manager-76d4d5b8f9-dvj9s\" (UID: \"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e\") " pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.305155 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.335079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-524xg\" (UniqueName: 
\"kubernetes.io/projected/04cbcc0c-4375-44f0-9461-b43492e9d95b-kube-api-access-524xg\") pod \"barbican-operator-controller-manager-75b8f798ff-t29z2\" (UID: \"04cbcc0c-4375-44f0-9461-b43492e9d95b\") " pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.345755 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358831 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358891 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod \"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358929 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.358976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.359002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.397833 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gr7f\" (UniqueName: \"kubernetes.io/projected/703a739a-6687-4324-b937-7d0efe7c143b-kube-api-access-2gr7f\") pod 
\"horizon-operator-controller-manager-7f5ddd8d7b-58qnd\" (UID: \"703a739a-6687-4324-b937-7d0efe7c143b\") " pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.398261 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.398606 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.401139 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgdtl\" (UniqueName: \"kubernetes.io/projected/0cabb338-c4a1-41b4-abd6-d535b0e88406-kube-api-access-vgdtl\") pod \"glance-operator-controller-manager-84d5bb46b-cvp5b\" (UID: \"0cabb338-c4a1-41b4-abd6-d535b0e88406\") " pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.411589 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.412744 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.418222 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-69nmj" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.423716 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.434128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.445343 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.446135 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.448520 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-6d7xh" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.452062 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.455904 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.466071 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.466885 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467010 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467053 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467088 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467152 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.467173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.467513 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.467567 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. 
No retries permitted until 2026-01-27 11:34:37.967550461 +0000 UTC m=+857.109148238 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.473498 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.477146 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c9b2k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.481122 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.484617 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7rjf\" (UniqueName: \"kubernetes.io/projected/0da235e3-e76a-408f-8e0e-3cdd7ce76705-kube-api-access-q7rjf\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.485352 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5djl\" (UniqueName: \"kubernetes.io/projected/4e719fbd-ac18-4ae1-bac6-c42f1e081daa-kube-api-access-k5djl\") pod \"keystone-operator-controller-manager-78f8b7b89c-2wqgg\" (UID: \"4e719fbd-ac18-4ae1-bac6-c42f1e081daa\") " pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.485712 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.487146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw89t\" (UniqueName: \"kubernetes.io/projected/b296a3cd-1dc1-4511-af7a-7b1801e23e61-kube-api-access-sw89t\") pod \"ironic-operator-controller-manager-58865f87b4-s2l5z\" (UID: \"b296a3cd-1dc1-4511-af7a-7b1801e23e61\") " pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.500768 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.501652 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"
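
The failed "cert" mount above is kubelet reporting that the Secret backing the infra-operator webhook volume does not exist yet; the pod is held back on that one volume while its other volumes (the projected kube-api-access tokens) mount normally. A minimal client-go sketch of the same lookup kubelet performs, useful for watching for the missing Secrets to appear; the secret names and namespace are copied from the failing operations in this log, while the kubeconfig handling is illustrative:

package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for the CRC host.
	kubeconfig := filepath.Join(os.Getenv("HOME"), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Secret names copied verbatim from the failing MountVolume operations
	// in this log (the later failures below name the last two).
	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
		"webhook-server-cert",
		"metrics-server-cert",
	} {
		if _, err := client.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{}); err != nil {
			fmt.Printf("%s: %v\n", name, err) // e.g. secrets "..." not found
			continue
		}
		fmt.Printf("%s: present\n", name)
	}
}
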
Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.503216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mghw4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.506089 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.514868 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.525097 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.525198 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.526075 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.527303 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-f4vht" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.529901 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.530669 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.535393 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2xknr" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.539275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.541978 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.542831 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.545363 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.546333 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-cfsqd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.546544 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.550368 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.551514 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-m7x54" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.555967 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.561551 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.563719 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.570274 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-rhhx8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2v5t\" (UniqueName: \"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571130 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod \"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.571231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xts7d\" (UniqueName: \"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.574262 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.574517 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.583424 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.587730 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.588884 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.590881 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hwctc" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.593141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.602077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbw7\" (UniqueName: \"kubernetes.io/projected/6c5084e4-b0e1-46fd-ae69-c0f2ede3db17-kube-api-access-fkbw7\") pod \"manila-operator-controller-manager-78b8f8fd84-8xrd7\" (UID: \"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17\") " pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.615590 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.616664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.620329 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-r6fhd" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.621214 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.646979 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.647900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.652845 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dxp8c" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.676867 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678160 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj29\" (UniqueName: \"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.678981 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679017 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2v5t\" (UniqueName: \"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679172 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: 
\"7df5397d-0c1f-46b4-8695-d80c752ca569\") " pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod \"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679273 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.679359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xts7d\" (UniqueName: \"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.731611 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.742048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xts7d\" (UniqueName: \"kubernetes.io/projected/56fb2890-7d29-452c-9f24-4aa20d977f0b-kube-api-access-xts7d\") pod \"mariadb-operator-controller-manager-7b88bfc995-tzn2s\" (UID: \"56fb2890-7d29-452c-9f24-4aa20d977f0b\") " pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.748956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrhjl\" (UniqueName: \"kubernetes.io/projected/6bcdd59a-9739-40e7-9625-3e56009dcbd7-kube-api-access-wrhjl\") pod \"neutron-operator-controller-manager-569695f6c5-pmk9t\" (UID: \"6bcdd59a-9739-40e7-9625-3e56009dcbd7\") " pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.756690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2v5t\" (UniqueName: \"kubernetes.io/projected/2a55fa83-c395-4ac2-bc2e-355ad48a4a95-kube-api-access-v2v5t\") pod \"nova-operator-controller-manager-74ffd97575-cln8g\" (UID: \"2a55fa83-c395-4ac2-bc2e-355ad48a4a95\") " pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.772900 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780246 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc 
kubenswrapper[4775]: I0127 11:34:37.780322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: \"7df5397d-0c1f-46b4-8695-d80c752ca569\") " pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780374 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.780444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcj29\" (UniqueName: \"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.780737 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.780806 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.280787765 +0000 UTC m=+857.422385542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.791811 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"
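
The retry timing in these mount failures is worth noticing: the first failure of each cert volume schedules a retry after durationBeforeRetry 500ms, and the later entries below show the same volumes rescheduled at 1s and then 2s. Kubelet's per-volume exponential backoff doubles the delay on every consecutive failure. A small sketch of that doubling policy; the 500ms starting point matches the log, while the 2m cap is an assumption standing in for kubelet's real limit:

package main

import (
	"fmt"
	"time"
)

// backoff mimics the doubling retry delay visible in the log
// (durationBeforeRetry 500ms -> 1s -> 2s -> ...).
type backoff struct {
	delay time.Duration // wait before the next retry
	max   time.Duration // assumed cap, for illustration only
}

// next returns the current delay and doubles it for the following failure,
// clamping at max.
func (b *backoff) next() time.Duration {
	d := b.delay
	b.delay *= 2
	if b.delay > b.max {
		b.delay = b.max
	}
	return d
}

func main() {
	b := &backoff{delay: 500 * time.Millisecond, max: 2 * time.Minute}
	for i := 1; i <= 5; i++ {
		fmt.Printf("retry %d scheduled after %v\n", i, b.next())
	}
	// Prints 500ms, 1s, 2s, 4s, 8s; the first three steps match the
	// durationBeforeRetry values recorded in this log.
}
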
Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.799503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj5jn\" (UniqueName: \"kubernetes.io/projected/701902fe-7e51-44b6-923b-0a60c96d6707-kube-api-access-lj5jn\") pod \"ovn-operator-controller-manager-bf6d4f946-p9vts\" (UID: \"701902fe-7e51-44b6-923b-0a60c96d6707\") " pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.801013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjwcx\" (UniqueName: \"kubernetes.io/projected/909c9a87-2eb1-4a52-b86d-6d36524b1eb2-kube-api-access-gjwcx\") pod \"swift-operator-controller-manager-65596dbf77-9sfp8\" (UID: \"909c9a87-2eb1-4a52-b86d-6d36524b1eb2\") " pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.809285 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.811557 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zwgt\" (UniqueName: \"kubernetes.io/projected/5070c545-d4c0-46b3-afb9-c130dc982406-kube-api-access-7zwgt\") pod \"test-operator-controller-manager-6c866cfdcb-2mz97\" (UID: \"5070c545-d4c0-46b3-afb9-c130dc982406\") " pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.815906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcj29\" (UniqueName: \"kubernetes.io/projected/01a03f23-ead5-4a15-976f-4dda2622083b-kube-api-access-dcj29\") pod \"telemetry-operator-controller-manager-7db57dc8bf-5lbbt\" (UID: \"01a03f23-ead5-4a15-976f-4dda2622083b\") " pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.817530 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818227 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818241 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818597 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr8zg\" (UniqueName: \"kubernetes.io/projected/7df5397d-0c1f-46b4-8695-d80c752ca569-kube-api-access-dr8zg\") pod \"octavia-operator-controller-manager-7bf4858b78-fcd9x\" (UID: \"7df5397d-0c1f-46b4-8695-d80c752ca569\") " pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818931 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.818971 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.819041 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822744 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-29t4c" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822769 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-slbwg" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822835 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfq7s\" (UniqueName: \"kubernetes.io/projected/e14198f0-3413-4350-bae5-33b23ceead05-kube-api-access-lfq7s\") pod \"placement-operator-controller-manager-7748d79f84-vmtx4\" (UID: \"e14198f0-3413-4350-bae5-33b23ceead05\") " pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.822968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.824599 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.829632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gm2t\" (UniqueName: \"kubernetes.io/projected/3e47cb1c-7f01-4b8d-904f-fed543678a02-kube-api-access-6gm2t\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.830816 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.851865 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.852003 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881653 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881706 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.881775 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.904956 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xv9n\" (UniqueName: \"kubernetes.io/projected/bea84175-0947-45e5-a635-b7d32a0442c6-kube-api-access-2xv9n\") pod \"watcher-operator-controller-manager-6476466c7c-lb4h8\" (UID: \"bea84175-0947-45e5-a635-b7d32a0442c6\") " pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.926029 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.939917 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.979703 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990164 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2"] Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990932 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: I0127 11:34:37.990960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.991306 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.991360 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.491345226 +0000 UTC m=+857.632943003 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992737 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992788 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.492772855 +0000 UTC m=+857.634370632 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992830 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:37 crc kubenswrapper[4775]: E0127 11:34:37.992851 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:38.992844557 +0000 UTC m=+858.134442334 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.011329 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrmm\" (UniqueName: \"kubernetes.io/projected/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-kube-api-access-ndrmm\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.023356 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knm5c\" (UniqueName: \"kubernetes.io/projected/a5e8d398-7976-4603-8409-304fa193f7f1-kube-api-access-knm5c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-g5nsq\" (UID: \"a5e8d398-7976-4603-8409-304fa193f7f1\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.192285 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.237664 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"
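
Every kubenswrapper record here uses the klog header layout: a severity letter fused with the month and day (I0127, E0127, W0127), the time with microseconds, the PID, the emitting source file and line, a closing bracket, then the message. A short sketch for splitting a journal line into those fields; the regular expression is a working assumption derived from the entries above, not an official grammar:

package main

import (
	"fmt"
	"regexp"
)

// klogHeader matches lines like:
//   E0127 11:34:37.992830 4775 secret.go:188] Couldn't get secret ...
// Groups: severity, MMDD, time, pid, source file:line, message.
var klogHeader = regexp.MustCompile(`([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+(\S+:\d+)\] (.*)`)

func main() {
	line := `E0127 11:34:37.992830 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found`
	m := klogHeader.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("not a klog line")
		return
	}
	fmt.Printf("severity=%s date=%s time=%s pid=%s source=%s\nmessage=%s\n",
		m[1], m[2], m[3], m[4], m[5], m[6])
}
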
Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.240275 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.263052 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.271179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.294147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.294723 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.294774 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.294760261 +0000 UTC m=+858.436358038 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.475637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" event={"ID":"f04fa2a0-7af2-439a-9169-6edf5be65b35","Type":"ContainerStarted","Data":"635cb70a23c2860502f533646fdb4d841561609d7d6d12757d03aea64d1f5c15"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.480543 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" event={"ID":"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e","Type":"ContainerStarted","Data":"60e06bdfb4344d6653703814829342f16064f4e9f6ccc2ae239efee14eed5d21"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.496348 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.496935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" event={"ID":"0cabb338-c4a1-41b4-abd6-d535b0e88406","Type":"ContainerStarted","Data":"9ff280c57a2ad9bfdcbb64fbf25906d01b4b4260b123d21e25aba910387259b1"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.503181 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") 
pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.503245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503428 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503503 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.503484862 +0000 UTC m=+858.645082639 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503925 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.503962 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:39.503950145 +0000 UTC m=+858.645547922 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.545367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" event={"ID":"04cbcc0c-4375-44f0-9461-b43492e9d95b","Type":"ContainerStarted","Data":"31114467e544a77d7142ca6664ad0354ae2be9de3f038de88002af551675944c"} Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.590939 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.593338 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod703a739a_6687_4324_b937_7d0efe7c143b.slice/crio-73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4 WatchSource:0}: Error finding container 73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4: Status 404 returned error can't find the container with id 73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.717879 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.723035 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb296a3cd_1dc1_4511_af7a_7b1801e23e61.slice/crio-db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737 WatchSource:0}: Error finding container db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737: Status 404 returned error can't find the container with id db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.745042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.770100 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.775847 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.781620 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e719fbd_ac18_4ae1_bac6_c42f1e081daa.slice/crio-b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1 WatchSource:0}: Error finding container b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1: Status 404 returned error can't find the container with id b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.882318 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.927610 4775 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t"] Jan 27 11:34:38 crc kubenswrapper[4775]: W0127 11:34:38.931420 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701902fe_7e51_44b6_923b_0a60c96d6707.slice/crio-cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3 WatchSource:0}: Error finding container cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3: Status 404 returned error can't find the container with id cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3 Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.935977 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.940535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x"] Jan 27 11:34:38 crc kubenswrapper[4775]: I0127 11:34:38.945488 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt"] Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.952320 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrhjl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-569695f6c5-pmk9t_openstack-operators(6bcdd59a-9739-40e7-9625-3e56009dcbd7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:38 crc kubenswrapper[4775]: E0127 11:34:38.953605 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.008713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.008895 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.008943 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.008930206 +0000 UTC m=+860.150527983 (durationBeforeRetry 2s). 
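---- editor's note ----
The MountVolume.SetUp failures in the records above (and in the Error detail just below) share one cause: each pod mounts a Secret volume whose Secret ("infra-operator-webhook-server-cert", "metrics-server-cert", "webhook-server-cert", "openstack-baremetal-operator-webhook-server-cert") does not yet exist in openstack-operators. The kubelet parks the mount in nestedpendingoperations and re-queues it on a timer, leaving the pod in ContainerCreating. A minimal client-go sketch of the lookup the kubelet keeps failing, assuming a kubeconfig path; the namespace and secret name are taken from the log:

    // wait_for_secret.go - illustrative only: poll until the Secret the kubelet
    // is waiting on exists. The kubeconfig path is an assumption.
    package main

    import (
        "context"
        "fmt"
        "time"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // assumed
        if err != nil {
            panic(err)
        }
        cs := kubernetes.NewForConfigOrDie(cfg)
        for {
            _, err := cs.CoreV1().Secrets("openstack-operators").
                Get(context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
            if err == nil {
                fmt.Println("secret present; the kubelet's next scheduled retry should mount it")
                return
            }
            fmt.Println("still missing:", err)
            time.Sleep(2 * time.Second) // the kubelet itself retries on a growing timer
        }
    }

Once the secret appears no manual kick is needed; the next retry visible in these records mounts it.
---- end note; log continues ----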
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.083177 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.095369 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.113311 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97"] Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.115475 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v2v5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74ffd97575-cln8g_openstack-operators(2a55fa83-c395-4ac2-bc2e-355ad48a4a95): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.115750 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-knm5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-g5nsq_openstack-operators(a5e8d398-7976-4603-8409-304fa193f7f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.116930 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.117181 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.117360 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lfq7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-7748d79f84-vmtx4_openstack-operators(e14198f0-3413-4350-bae5-33b23ceead05): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.119567 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.123658 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq"] Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.140118 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zwgt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-6c866cfdcb-2mz97_openstack-operators(5070c545-d4c0-46b3-afb9-c130dc982406): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.142067 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.143065 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8"] Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.371771 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.371903 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.371945 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.371932719 +0000 UTC m=+860.513530496 (durationBeforeRetry 2s). 
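---- editor's note ----
The ErrImagePull "pull QPS exceeded" failures above are generated by the kubelet itself, not by quay.io: image pulls pass through a token-bucket rate limiter (the registryPullQPS and registryBurst kubelet settings, which default to 5 pulls/s with a burst of 10), and roughly twenty operator pods requested images within the same second or two. A sketch of that admission decision using golang.org/x/time/rate; the limits shown are the assumed defaults, and the kubelet's real limiter lives in its image manager:

    // qps_demo.go - token-bucket sketch of the kubelet's image-pull limit.
    package main

    import (
        "fmt"

        "golang.org/x/time/rate"
    )

    func main() {
        limiter := rate.NewLimiter(rate.Limit(5), 10) // assumed defaults: 5/s, burst 10
        for i := 1; i <= 20; i++ {                    // ~20 operator pods start at once
            if limiter.Allow() {
                fmt.Printf("pull %2d: admitted\n", i)
            } else {
                // this path is what surfaces as ErrImagePull: "pull QPS exceeded"
                fmt.Printf("pull %2d: rejected (QPS exceeded)\n", i)
            }
        }
    }

The rejected pulls are not lost; they re-enter the image backoff, which is why the same pods report ImagePullBackOff moments later.
---- end note; log continues ----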
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.565764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" event={"ID":"bea84175-0947-45e5-a635-b7d32a0442c6","Type":"ContainerStarted","Data":"de0d07df41b70f712f4495a5c9d10b333e9dfe524846a62c156a15afac6ffbf2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.570906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" event={"ID":"6bcdd59a-9739-40e7-9625-3e56009dcbd7","Type":"ContainerStarted","Data":"7e2a71620a96de3c0d78581b7109e8a2dd456a79ea833aa315e2fa5f03de02b1"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.575846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579670 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" event={"ID":"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17","Type":"ContainerStarted","Data":"6b0c0397fd7e8f15f43be57a5446f9a1ee7dd2e74b91a1285468ea9820c3f6d3"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.579972 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580157 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580279 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580359 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.58027714 +0000 UTC m=+860.721874917 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.580394 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:41.580387852 +0000 UTC m=+860.721985629 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.581161 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" event={"ID":"7df5397d-0c1f-46b4-8695-d80c752ca569","Type":"ContainerStarted","Data":"3f9cab5092ec3e0828421e5d32caa592c31724a0584923795e7daff3e6b69c0a"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.583172 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" event={"ID":"701902fe-7e51-44b6-923b-0a60c96d6707","Type":"ContainerStarted","Data":"cc1b9069ecc3964c9234c9be31a0a02608ff78354a4ff534224bfa477bba88d3"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.585193 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" event={"ID":"5070c545-d4c0-46b3-afb9-c130dc982406","Type":"ContainerStarted","Data":"05cb3cba9a63ad4223e70482b58fa699ab2d97e4e9f695bd16501c8a0c6e52e5"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.587357 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6\\\"\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.589089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" event={"ID":"01a03f23-ead5-4a15-976f-4dda2622083b","Type":"ContainerStarted","Data":"a12385f6b6a14bb0efff0b292cd456d54d060b0815340f2f6c58a2d14f2fb4c2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.590862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" event={"ID":"a5e8d398-7976-4603-8409-304fa193f7f1","Type":"ContainerStarted","Data":"a127703edaf726d8d0a6759706c1413eadce88c4fadbc649f9181ce194d0df53"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.593751 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" event={"ID":"56fb2890-7d29-452c-9f24-4aa20d977f0b","Type":"ContainerStarted","Data":"2a8c518ad4c1694cfba4686074c46253f254793efa03210cd6c86e60c6b2b54a"} Jan 27 11:34:39 crc 
kubenswrapper[4775]: I0127 11:34:39.595085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" event={"ID":"2a55fa83-c395-4ac2-bc2e-355ad48a4a95","Type":"ContainerStarted","Data":"2df9ecf40ff4e7c5039b686e527d410aafce1e43f24d8d96f693a67d3f8b67f7"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.595553 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.597438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" event={"ID":"4e719fbd-ac18-4ae1-bac6-c42f1e081daa","Type":"ContainerStarted","Data":"b2e2dd5c0acb7d4766cfb25a7635776ed6f6b787167dd14e301ab9619bda78f1"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.597497 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.600427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" event={"ID":"703a739a-6687-4324-b937-7d0efe7c143b","Type":"ContainerStarted","Data":"73c6e8ea67e58041f7bf63e1f0103ad7a8e0b4bdd083a7454f90fad557c5a6e4"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.602651 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" event={"ID":"dd9264fb-034f-46d3-8698-dcc6fc3470f6","Type":"ContainerStarted","Data":"0b2ec739a4678906269d7e56c4e3d973c36d50c3b9858ac7441988e3f88587d2"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.614138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" event={"ID":"b296a3cd-1dc1-4511-af7a-7b1801e23e61","Type":"ContainerStarted","Data":"db061908966343139d0ca441bab58e2df1b17b0f1c4361cbfb94a5d486df2737"} Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.622758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" event={"ID":"e14198f0-3413-4350-bae5-33b23ceead05","Type":"ContainerStarted","Data":"2b30435a88fba4d79f128cebe3008c8324ac25f329488b697a57319c3079f50a"} Jan 27 11:34:39 crc kubenswrapper[4775]: E0127 11:34:39.624595 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:39 crc kubenswrapper[4775]: I0127 11:34:39.626602 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" event={"ID":"909c9a87-2eb1-4a52-b86d-6d36524b1eb2","Type":"ContainerStarted","Data":"4c14ed35bd38ffe59dd1cbc70bb1c4834fc14d1087a80d81e0117f38a3c9eef3"} Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.647202 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podUID="a5e8d398-7976-4603-8409-304fa193f7f1" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648093 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/neutron-operator@sha256:949870b350604b04062be6d035099ea54982d663328fe1604123fbadfad20a89\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podUID="6bcdd59a-9739-40e7-9625-3e56009dcbd7" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648855 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/placement-operator@sha256:a40693d0a2ee7b50ff5b2bd339bc0ce358ccc16309e803e40d8b26e189a2b4c0\\\"\"" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podUID="e14198f0-3413-4350-bae5-33b23ceead05" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.648924 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/nova-operator@sha256:9c0272b9043057e7fd740843e11c951ce93d5169298ed91aa8a60a702649f7cf\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podUID="2a55fa83-c395-4ac2-bc2e-355ad48a4a95" Jan 27 11:34:40 crc kubenswrapper[4775]: E0127 11:34:40.649361 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:4e3d234c1398039c2593611f7b0fd2a6b284cafb1563e6737876a265b9af42b6\\\"\"" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podUID="5070c545-d4c0-46b3-afb9-c130dc982406" Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.100462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.100653 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.100733 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. 
No retries permitted until 2026-01-27 11:34:45.100714373 +0000 UTC m=+864.242312150 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.405222 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.405360 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.405462 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.405432184 +0000 UTC m=+864.547029961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.607960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:41 crc kubenswrapper[4775]: I0127 11:34:41.608013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608247 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608335 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.608316775 +0000 UTC m=+864.749914552 (durationBeforeRetry 4s). 
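---- editor's note ----
The "No retries permitted until ... (durationBeforeRetry Ns)" suffix is the volume manager's exponential backoff at work: the same cert volumes were retried at 2s above, sit at 4s here, and reach 8s and then 16s further down. A toy sketch of the doubling; the cap is an assumption for illustration, since this log never shows the limit being reached:

    // backoff.go - mirrors the 2s -> 4s -> 8s -> 16s growth in these records.
    package main

    import (
        "fmt"
        "time"
    )

    func next(d, limit time.Duration) time.Duration {
        d *= 2
        if d > limit {
            return limit
        }
        return d
    }

    func main() {
        d := 2 * time.Second
        for i := 0; i < 6; i++ {
            fmt.Println("durationBeforeRetry", d)
            d = next(d, 2*time.Minute) // assumed cap, not taken from kubelet source
        }
    }
---- end note; log continues ----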
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608691 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:41 crc kubenswrapper[4775]: E0127 11:34:41.608730 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:45.608721896 +0000 UTC m=+864.750319663 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.154269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.154428 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.154995 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.154956783 +0000 UTC m=+872.296554600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.459952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.460091 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.460211 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.460197819 +0000 UTC m=+872.601795596 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.662093 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:45 crc kubenswrapper[4775]: I0127 11:34:45.662273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662291 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662414 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662480 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.662462324 +0000 UTC m=+872.804060101 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:45 crc kubenswrapper[4775]: E0127 11:34:45.662970 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:34:53.662958667 +0000 UTC m=+872.804556444 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.735441 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9" Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.736481 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6nr5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-76d4d5b8f9-dvj9s_openstack-operators(c31d5b06-1ad2-4914-96c1-e0f0b8c4974e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:52 crc kubenswrapper[4775]: E0127 11:34:52.737712 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" 
podUID="c31d5b06-1ad2-4914-96c1-e0f0b8c4974e" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.181666 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.181812 4775 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.181862 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert podName:0da235e3-e76a-408f-8e0e-3cdd7ce76705 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.181848317 +0000 UTC m=+888.323446094 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert") pod "infra-operator-controller-manager-54ccf4f85d-d7vhk" (UID: "0da235e3-e76a-408f-8e0e-3cdd7ce76705") : secret "infra-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.367740 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.367904 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dcj29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7db57dc8bf-5lbbt_openstack-operators(01a03f23-ead5-4a15-976f-4dda2622083b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.369089 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podUID="01a03f23-ead5-4a15-976f-4dda2622083b" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.485275 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.485443 4775 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.485500 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert podName:3e47cb1c-7f01-4b8d-904f-fed543678a02 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.48548748 +0000 UTC m=+888.627085257 (durationBeforeRetry 16s). 
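---- editor's note ----
The pull failures from 11:34:52 onward differ from the earlier QPS rejections: "rpc error: code = Canceled desc = copying config: context canceled" means the CRI pull RPC was abandoned because its Go context was canceled on the client side before the image config finished copying (plausible here, with many queued pulls competing after the earlier backoffs). A sketch of how a canceled context maps to that exact gRPC status:

    // canceled.go - a canceled Go context surfaces as gRPC code Canceled.
    package main

    import (
        "context"
        "fmt"

        "google.golang.org/grpc/status"
    )

    func main() {
        ctx, cancel := context.WithCancel(context.Background())
        cancel() // e.g. the caller gave up on this pull
        st := status.FromContextError(ctx.Err())
        fmt.Println(st.Code(), "-", st.Message()) // Canceled - context canceled
    }
---- end note; log continues ----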
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert") pod "openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" (UID: "3e47cb1c-7f01-4b8d-904f-fed543678a02") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.688322 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:53 crc kubenswrapper[4775]: I0127 11:34:53.688723 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688410 4775 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688917 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.688898425 +0000 UTC m=+888.830496202 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "metrics-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.688863 4775 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.689078 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs podName:2ecfe007-a4bf-4c31-bc83-36f4c5f00815 nodeName:}" failed. No retries permitted until 2026-01-27 11:35:09.689067599 +0000 UTC m=+888.830665376 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs") pod "openstack-operator-controller-manager-76958f4d87-8js8k" (UID: "2ecfe007-a4bf-4c31-bc83-36f4c5f00815") : secret "webhook-server-cert" not found Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.733746 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/telemetry-operator@sha256:578ea6a6c68040cb54e0160462dc2b97226594621a5f441fa1d58f429cf0e010\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podUID="01a03f23-ead5-4a15-976f-4dda2622083b" Jan 27 11:34:53 crc kubenswrapper[4775]: E0127 11:34:53.733846 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/designate-operator@sha256:9a27f561c9f23884b67f4fab9c8d2615b46cf4d324003a623470aa85771187d9\\\"\"" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" podUID="c31d5b06-1ad2-4914-96c1-e0f0b8c4974e" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.040338 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.040537 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2xv9n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6476466c7c-lb4h8_openstack-operators(bea84175-0947-45e5-a635-b7d32a0442c6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.042333 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podUID="bea84175-0947-45e5-a635-b7d32a0442c6" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.586267 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.586473 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lj5jn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bf6d4f946-p9vts_openstack-operators(701902fe-7e51-44b6-923b-0a60c96d6707): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.587673 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podUID="701902fe-7e51-44b6-923b-0a60c96d6707" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.739563 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/watcher-operator@sha256:611e4fb8bf6cd263664ccb437637105fba633ba8f701c228fd525a7a7b3c8d74\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podUID="bea84175-0947-45e5-a635-b7d32a0442c6" Jan 27 11:34:54 crc kubenswrapper[4775]: E0127 11:34:54.740168 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podUID="701902fe-7e51-44b6-923b-0a60c96d6707" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.193099 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.193283 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dr8zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-7bf4858b78-fcd9x_openstack-operators(7df5397d-0c1f-46b4-8695-d80c752ca569): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.194538 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podUID="7df5397d-0c1f-46b4-8695-d80c752ca569" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.637900 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.638063 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gjwcx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-65596dbf77-9sfp8_openstack-operators(909c9a87-2eb1-4a52-b86d-6d36524b1eb2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.639252 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podUID="909c9a87-2eb1-4a52-b86d-6d36524b1eb2" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.745151 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/swift-operator@sha256:018ae1352a061ad22a0d4ac5764eb7e19cf5a1d6c2e554f61ae0bd82ebe62e29\\\"\"" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podUID="909c9a87-2eb1-4a52-b86d-6d36524b1eb2" Jan 27 11:34:55 crc kubenswrapper[4775]: E0127 11:34:55.745219 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/octavia-operator@sha256:c71c081c53239338b69dc68bde59707ecafa147c81489fd755b82a9f1af402bd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podUID="7df5397d-0c1f-46b4-8695-d80c752ca569" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.181964 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.182472 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k5djl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-78f8b7b89c-2wqgg_openstack-operators(4e719fbd-ac18-4ae1-bac6-c42f1e081daa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.183808 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podUID="4e719fbd-ac18-4ae1-bac6-c42f1e081daa" Jan 27 11:34:56 crc kubenswrapper[4775]: E0127 11:34:56.751240 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/rh-ee-vfisarov/keystone-operator@sha256:3f07fd90b18820601ae78f45a9fbef53bf9e3ed131d5cfa1d424ae0145862dd6\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podUID="4e719fbd-ac18-4ae1-bac6-c42f1e081daa" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.774006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" 
event={"ID":"dd9264fb-034f-46d3-8698-dcc6fc3470f6","Type":"ContainerStarted","Data":"548593cc29a876f9222f1632cc88a4434c35b1898d79274985cccf44b91c7fe6"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.774633 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.775242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" event={"ID":"0cabb338-c4a1-41b4-abd6-d535b0e88406","Type":"ContainerStarted","Data":"b00b005894f23ad2c865391e2949c5760bc0be62a93c590e72b544cb0fe412cf"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.775376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.776772 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" event={"ID":"2a55fa83-c395-4ac2-bc2e-355ad48a4a95","Type":"ContainerStarted","Data":"e13af6ea8377cebc5d591f7e6fbe9c7804b94d0e73b83409bf1812ec8449b5b0"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.776909 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.778536 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" event={"ID":"6bcdd59a-9739-40e7-9625-3e56009dcbd7","Type":"ContainerStarted","Data":"94b0270e1c432c8f8499a7de0d29a99be4a4a802962461ad697581cdeee69138"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.778716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.780124 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" event={"ID":"04cbcc0c-4375-44f0-9461-b43492e9d95b","Type":"ContainerStarted","Data":"7dd9730a8cbf8f49919b51773eaec381e7159052fa47e3be7cc62ecc6c641081"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.780246 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.781466 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" event={"ID":"e14198f0-3413-4350-bae5-33b23ceead05","Type":"ContainerStarted","Data":"0e67793155549f91af8a8c7045fdc4a09671f113a8b99df1e493fd31d9c32325"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.781595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.782668 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" event={"ID":"f04fa2a0-7af2-439a-9169-6edf5be65b35","Type":"ContainerStarted","Data":"7528a4bdfcd62ca8291923b4dc41b3a10a858b5e7b24dae765425ae4d120c882"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 
11:35:00.782792 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.784057 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" event={"ID":"6c5084e4-b0e1-46fd-ae69-c0f2ede3db17","Type":"ContainerStarted","Data":"e7e9dd09cbb8f84479bc5dfa684ca55126249450ab3f0b79dc594b1a37c92c49"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.784104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.785274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" event={"ID":"703a739a-6687-4324-b937-7d0efe7c143b","Type":"ContainerStarted","Data":"2846fde5c67d3282993a541e7b8228d5d80f8035ece155d120aa7b92c788f5d7"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.785376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.786442 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" event={"ID":"56fb2890-7d29-452c-9f24-4aa20d977f0b","Type":"ContainerStarted","Data":"0babafb3000afa84eab9b817fec9baae925504384ba783e0602178d7974538d5"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.786552 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.787736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" event={"ID":"5070c545-d4c0-46b3-afb9-c130dc982406","Type":"ContainerStarted","Data":"061d59c933ee41017ec8a52ab83bc5aa3518bc1c0de172d74a18d409a54ec297"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.787846 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.789006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" event={"ID":"b296a3cd-1dc1-4511-af7a-7b1801e23e61","Type":"ContainerStarted","Data":"3adb6b812f62ead77633d9d24d3690032be20991d85b12e9a8e6cdfffa621a10"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.789072 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.790502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" event={"ID":"a5e8d398-7976-4603-8409-304fa193f7f1","Type":"ContainerStarted","Data":"a4d4be1769df7e26ecb3ccbd864663e043aa4f6bb7a5dc038b9f62290c7f1e85"} Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.799028 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" podStartSLOduration=5.186058543 
podStartE2EDuration="23.799013902s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.545830289 +0000 UTC m=+857.687428066" lastFinishedPulling="2026-01-27 11:34:57.158785648 +0000 UTC m=+876.300383425" observedRunningTime="2026-01-27 11:35:00.795761914 +0000 UTC m=+879.937359691" watchObservedRunningTime="2026-01-27 11:35:00.799013902 +0000 UTC m=+879.940611679" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.827665 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" podStartSLOduration=5.740964698 podStartE2EDuration="24.827642755s" podCreationTimestamp="2026-01-27 11:34:36 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.072313257 +0000 UTC m=+857.213911034" lastFinishedPulling="2026-01-27 11:34:57.158991324 +0000 UTC m=+876.300589091" observedRunningTime="2026-01-27 11:35:00.822539225 +0000 UTC m=+879.964137032" watchObservedRunningTime="2026-01-27 11:35:00.827642755 +0000 UTC m=+879.969240532" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.847815 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" podStartSLOduration=5.038539904 podStartE2EDuration="23.847795315s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.349153577 +0000 UTC m=+857.490751354" lastFinishedPulling="2026-01-27 11:34:57.158408988 +0000 UTC m=+876.300006765" observedRunningTime="2026-01-27 11:35:00.843946249 +0000 UTC m=+879.985544026" watchObservedRunningTime="2026-01-27 11:35:00.847795315 +0000 UTC m=+879.989393092" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.864720 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-g5nsq" podStartSLOduration=3.151021835 podStartE2EDuration="23.864704726s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.115532837 +0000 UTC m=+858.257130614" lastFinishedPulling="2026-01-27 11:34:59.829215688 +0000 UTC m=+878.970813505" observedRunningTime="2026-01-27 11:35:00.863655958 +0000 UTC m=+880.005253735" watchObservedRunningTime="2026-01-27 11:35:00.864704726 +0000 UTC m=+880.006302503" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.881536 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" podStartSLOduration=3.038736259 podStartE2EDuration="23.881522806s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.952202596 +0000 UTC m=+858.093800363" lastFinishedPulling="2026-01-27 11:34:59.794989113 +0000 UTC m=+878.936586910" observedRunningTime="2026-01-27 11:35:00.87728024 +0000 UTC m=+880.018878027" watchObservedRunningTime="2026-01-27 11:35:00.881522806 +0000 UTC m=+880.023120583" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.895759 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" podStartSLOduration=6.565429795 podStartE2EDuration="24.895744764s" podCreationTimestamp="2026-01-27 11:34:36 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.324549386 +0000 UTC m=+857.466147163" lastFinishedPulling="2026-01-27 11:34:56.654864355 +0000 UTC m=+875.796462132" 
observedRunningTime="2026-01-27 11:35:00.894225913 +0000 UTC m=+880.035823690" watchObservedRunningTime="2026-01-27 11:35:00.895744764 +0000 UTC m=+880.037342541" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.918783 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" podStartSLOduration=5.486235222 podStartE2EDuration="23.918751193s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.726211885 +0000 UTC m=+857.867809662" lastFinishedPulling="2026-01-27 11:34:57.158727856 +0000 UTC m=+876.300325633" observedRunningTime="2026-01-27 11:35:00.915933166 +0000 UTC m=+880.057530953" watchObservedRunningTime="2026-01-27 11:35:00.918751193 +0000 UTC m=+880.060348960" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.938131 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" podStartSLOduration=4.020582015 podStartE2EDuration="23.938111012s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.751892216 +0000 UTC m=+857.893489993" lastFinishedPulling="2026-01-27 11:34:58.669421173 +0000 UTC m=+877.811018990" observedRunningTime="2026-01-27 11:35:00.937863724 +0000 UTC m=+880.079461511" watchObservedRunningTime="2026-01-27 11:35:00.938111012 +0000 UTC m=+880.079708789" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.957741 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" podStartSLOduration=3.341607861 podStartE2EDuration="23.957717337s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.117039848 +0000 UTC m=+858.258637625" lastFinishedPulling="2026-01-27 11:34:59.733149284 +0000 UTC m=+878.874747101" observedRunningTime="2026-01-27 11:35:00.954623402 +0000 UTC m=+880.096221179" watchObservedRunningTime="2026-01-27 11:35:00.957717337 +0000 UTC m=+880.099315114" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.994345 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" podStartSLOduration=3.341953041 podStartE2EDuration="23.994327397s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.115041863 +0000 UTC m=+858.256639640" lastFinishedPulling="2026-01-27 11:34:59.767416179 +0000 UTC m=+878.909013996" observedRunningTime="2026-01-27 11:35:00.979888512 +0000 UTC m=+880.121486289" watchObservedRunningTime="2026-01-27 11:35:00.994327397 +0000 UTC m=+880.135925174" Jan 27 11:35:00 crc kubenswrapper[4775]: I0127 11:35:00.996101 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" podStartSLOduration=3.366653525 podStartE2EDuration="23.996092465s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.139923773 +0000 UTC m=+858.281521550" lastFinishedPulling="2026-01-27 11:34:59.769362673 +0000 UTC m=+878.910960490" observedRunningTime="2026-01-27 11:35:00.992817886 +0000 UTC m=+880.134415673" watchObservedRunningTime="2026-01-27 11:35:00.996092465 +0000 UTC m=+880.137690242" Jan 27 11:35:01 crc kubenswrapper[4775]: I0127 11:35:01.011152 4775 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" podStartSLOduration=6.13756365 podStartE2EDuration="24.011135686s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.78131635 +0000 UTC m=+857.922914127" lastFinishedPulling="2026-01-27 11:34:56.654888386 +0000 UTC m=+875.796486163" observedRunningTime="2026-01-27 11:35:01.007591959 +0000 UTC m=+880.149189736" watchObservedRunningTime="2026-01-27 11:35:01.011135686 +0000 UTC m=+880.152733463" Jan 27 11:35:01 crc kubenswrapper[4775]: I0127 11:35:01.030241 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" podStartSLOduration=4.463669585 podStartE2EDuration="24.030221707s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.595015452 +0000 UTC m=+857.736613229" lastFinishedPulling="2026-01-27 11:34:58.161567574 +0000 UTC m=+877.303165351" observedRunningTime="2026-01-27 11:35:01.025864818 +0000 UTC m=+880.167462595" watchObservedRunningTime="2026-01-27 11:35:01.030221707 +0000 UTC m=+880.171819484" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.828963 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" event={"ID":"c31d5b06-1ad2-4914-96c1-e0f0b8c4974e","Type":"ContainerStarted","Data":"967e70286a376c5cff07b0e24d07f343dd672d3fde789d839386b64ef515fb7f"} Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.829850 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.831317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" event={"ID":"01a03f23-ead5-4a15-976f-4dda2622083b","Type":"ContainerStarted","Data":"243a0b9063d35bc448b8e7118de9b12c1c5c081d0843da8c2d20773d44ae4d89"} Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.831603 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.841339 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" podStartSLOduration=1.897954586 podStartE2EDuration="29.841324048s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.349445156 +0000 UTC m=+857.491042933" lastFinishedPulling="2026-01-27 11:35:06.292814618 +0000 UTC m=+885.434412395" observedRunningTime="2026-01-27 11:35:06.841231346 +0000 UTC m=+885.982829163" watchObservedRunningTime="2026-01-27 11:35:06.841324048 +0000 UTC m=+885.982921825" Jan 27 11:35:06 crc kubenswrapper[4775]: I0127 11:35:06.872690 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" podStartSLOduration=2.683120188 podStartE2EDuration="29.872667264s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.950626604 +0000 UTC m=+858.092224381" lastFinishedPulling="2026-01-27 11:35:06.14017368 +0000 UTC m=+885.281771457" observedRunningTime="2026-01-27 11:35:06.862561798 +0000 UTC 
m=+886.004159565" watchObservedRunningTime="2026-01-27 11:35:06.872667264 +0000 UTC m=+886.014265041" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.403038 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-75b8f798ff-t29z2" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.427266 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5fdc687f5-9wc4j" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.470387 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-658dd65b86-jp5c7" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.497327 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84d5bb46b-cvp5b" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.529066 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-7f5ddd8d7b-58qnd" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.577092 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-58865f87b4-s2l5z" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.736195 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-78b8f8fd84-8xrd7" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.775081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b88bfc995-tzn2s" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.795378 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-569695f6c5-pmk9t" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.854754 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74ffd97575-cln8g" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.855629 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-6c866cfdcb-2mz97" Jan 27 11:35:07 crc kubenswrapper[4775]: I0127 11:35:07.985356 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-7748d79f84-vmtx4" Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.866055 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" event={"ID":"4e719fbd-ac18-4ae1-bac6-c42f1e081daa","Type":"ContainerStarted","Data":"772ec2f550b879ad4921a83a00a00bb60ff43cf4936267420136d3c27fcc9202"} Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.866668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:35:08 crc kubenswrapper[4775]: I0127 11:35:08.894184 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" podStartSLOduration=2.440005498 podStartE2EDuration="31.894165951s" 
podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.783868559 +0000 UTC m=+857.925466336" lastFinishedPulling="2026-01-27 11:35:08.238029022 +0000 UTC m=+887.379626789" observedRunningTime="2026-01-27 11:35:08.88166509 +0000 UTC m=+888.023262907" watchObservedRunningTime="2026-01-27 11:35:08.894165951 +0000 UTC m=+888.035763728" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.217662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.225256 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0da235e3-e76a-408f-8e0e-3cdd7ce76705-cert\") pod \"infra-operator-controller-manager-54ccf4f85d-d7vhk\" (UID: \"0da235e3-e76a-408f-8e0e-3cdd7ce76705\") " pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.446442 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.521507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.532145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3e47cb1c-7f01-4b8d-904f-fed543678a02-cert\") pod \"openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8\" (UID: \"3e47cb1c-7f01-4b8d-904f-fed543678a02\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.724294 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.724584 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.732365 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-metrics-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " 
pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.732945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2ecfe007-a4bf-4c31-bc83-36f4c5f00815-webhook-certs\") pod \"openstack-operator-controller-manager-76958f4d87-8js8k\" (UID: \"2ecfe007-a4bf-4c31-bc83-36f4c5f00815\") " pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.760822 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:09 crc kubenswrapper[4775]: I0127 11:35:09.883905 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk"] Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.017483 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.024191 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8"] Jan 27 11:35:10 crc kubenswrapper[4775]: W0127 11:35:10.033652 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e47cb1c_7f01_4b8d_904f_fed543678a02.slice/crio-74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b WatchSource:0}: Error finding container 74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b: Status 404 returned error can't find the container with id 74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.224953 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k"] Jan 27 11:35:10 crc kubenswrapper[4775]: W0127 11:35:10.234631 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ecfe007_a4bf_4c31_bc83_36f4c5f00815.slice/crio-788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf WatchSource:0}: Error finding container 788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf: Status 404 returned error can't find the container with id 788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.892246 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" event={"ID":"bea84175-0947-45e5-a635-b7d32a0442c6","Type":"ContainerStarted","Data":"29ad418aeb425efe4b3c4f68a14c6fcf53bcf8fa12afa24c93e327cb1555d94e"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.892492 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.895603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" 
event={"ID":"3e47cb1c-7f01-4b8d-904f-fed543678a02","Type":"ContainerStarted","Data":"74be6881a5a8b8d9b79e8653ce9ca4787fab7c38b28f8c5b3e7fd6395fe57f9b"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.926737 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" podStartSLOduration=2.706405844 podStartE2EDuration="33.926721691s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:39.128522802 +0000 UTC m=+858.270120579" lastFinishedPulling="2026-01-27 11:35:10.348838649 +0000 UTC m=+889.490436426" observedRunningTime="2026-01-27 11:35:10.923873533 +0000 UTC m=+890.065471310" watchObservedRunningTime="2026-01-27 11:35:10.926721691 +0000 UTC m=+890.068319468" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.931699 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" event={"ID":"909c9a87-2eb1-4a52-b86d-6d36524b1eb2","Type":"ContainerStarted","Data":"8cf2c4e631781a2b74c75ba33fe1e4854d82020b9e1395e4d25b4627fff1b91b"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.932630 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.958391 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" podStartSLOduration=2.502451154 podStartE2EDuration="33.958375485s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.89556646 +0000 UTC m=+858.037164237" lastFinishedPulling="2026-01-27 11:35:10.351490771 +0000 UTC m=+889.493088568" observedRunningTime="2026-01-27 11:35:10.95711812 +0000 UTC m=+890.098715897" watchObservedRunningTime="2026-01-27 11:35:10.958375485 +0000 UTC m=+890.099973262" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.960633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" event={"ID":"2ecfe007-a4bf-4c31-bc83-36f4c5f00815","Type":"ContainerStarted","Data":"5162c94e4d7380d09ba424c0ef305ff23a4a24e167a2a891157a7753e88d2421"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.960666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" event={"ID":"2ecfe007-a4bf-4c31-bc83-36f4c5f00815","Type":"ContainerStarted","Data":"788c6c64ec3b818946b6eea66278745be8afe4e509e78d684645f20d24cebcaf"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.961416 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.973479 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" event={"ID":"7df5397d-0c1f-46b4-8695-d80c752ca569","Type":"ContainerStarted","Data":"85b6560e297a54acf133a7e81757a18f504d9a16cff2cdaa7bb153806099d042"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.974132 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 
11:35:10.980122 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" event={"ID":"701902fe-7e51-44b6-923b-0a60c96d6707","Type":"ContainerStarted","Data":"e6fc78a1889c65e6fd8d1767fd2c7f07389fd3e7b8e07261244947755adca48e"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.980651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.984998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" event={"ID":"0da235e3-e76a-408f-8e0e-3cdd7ce76705","Type":"ContainerStarted","Data":"747ec97e4507215c75ea2ae6bd21b95c09cf3cdea75c831ed344535fae7d07a9"} Jan 27 11:35:10 crc kubenswrapper[4775]: I0127 11:35:10.998857 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" podStartSLOduration=33.99884029 podStartE2EDuration="33.99884029s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:35:10.99519671 +0000 UTC m=+890.136794487" watchObservedRunningTime="2026-01-27 11:35:10.99884029 +0000 UTC m=+890.140438067" Jan 27 11:35:11 crc kubenswrapper[4775]: I0127 11:35:11.064372 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" podStartSLOduration=2.773367003 podStartE2EDuration="34.064357639s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.933764413 +0000 UTC m=+858.075362190" lastFinishedPulling="2026-01-27 11:35:10.224755059 +0000 UTC m=+889.366352826" observedRunningTime="2026-01-27 11:35:11.032641653 +0000 UTC m=+890.174239430" watchObservedRunningTime="2026-01-27 11:35:11.064357639 +0000 UTC m=+890.205955416" Jan 27 11:35:11 crc kubenswrapper[4775]: I0127 11:35:11.769274 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" podStartSLOduration=3.440730008 podStartE2EDuration="34.769237619s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:34:38.950613013 +0000 UTC m=+858.092210790" lastFinishedPulling="2026-01-27 11:35:10.279120604 +0000 UTC m=+889.420718401" observedRunningTime="2026-01-27 11:35:11.073712245 +0000 UTC m=+890.215310022" watchObservedRunningTime="2026-01-27 11:35:11.769237619 +0000 UTC m=+890.910835416" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.009569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" event={"ID":"0da235e3-e76a-408f-8e0e-3cdd7ce76705","Type":"ContainerStarted","Data":"5dbb2248bb9af2ed3e66020381d43a322be0784e17193bbc6fc65a4e3101b6aa"} Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.009863 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.012129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" 
event={"ID":"3e47cb1c-7f01-4b8d-904f-fed543678a02","Type":"ContainerStarted","Data":"ef164437dd6589a87691e5731309ec83dd85a93f5d9cf05212a2c7423583c4b4"} Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.012223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.029311 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" podStartSLOduration=33.124172622 podStartE2EDuration="36.029296301s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:35:09.897954485 +0000 UTC m=+889.039552262" lastFinishedPulling="2026-01-27 11:35:12.803078164 +0000 UTC m=+891.944675941" observedRunningTime="2026-01-27 11:35:13.026911177 +0000 UTC m=+892.168508954" watchObservedRunningTime="2026-01-27 11:35:13.029296301 +0000 UTC m=+892.170894078" Jan 27 11:35:13 crc kubenswrapper[4775]: I0127 11:35:13.059585 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" podStartSLOduration=33.287000699000004 podStartE2EDuration="36.059564648s" podCreationTimestamp="2026-01-27 11:34:37 +0000 UTC" firstStartedPulling="2026-01-27 11:35:10.035694006 +0000 UTC m=+889.177291783" lastFinishedPulling="2026-01-27 11:35:12.808257955 +0000 UTC m=+891.949855732" observedRunningTime="2026-01-27 11:35:13.053512953 +0000 UTC m=+892.195110730" watchObservedRunningTime="2026-01-27 11:35:13.059564648 +0000 UTC m=+892.201162425" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.458438 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-76d4d5b8f9-dvj9s" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.680791 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-78f8b7b89c-2wqgg" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.813961 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-65596dbf77-9sfp8" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.838769 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7db57dc8bf-5lbbt" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.929619 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7bf4858b78-fcd9x" Jan 27 11:35:17 crc kubenswrapper[4775]: I0127 11:35:17.942786 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bf6d4f946-p9vts" Jan 27 11:35:18 crc kubenswrapper[4775]: I0127 11:35:18.197270 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6476466c7c-lb4h8" Jan 27 11:35:19 crc kubenswrapper[4775]: I0127 11:35:19.459595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54ccf4f85d-d7vhk" Jan 27 11:35:19 crc kubenswrapper[4775]: I0127 11:35:19.773749 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8" Jan 27 11:35:20 crc kubenswrapper[4775]: I0127 11:35:20.022700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76958f4d87-8js8k" Jan 27 11:35:29 crc kubenswrapper[4775]: I0127 11:35:29.517443 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:35:29 crc kubenswrapper[4775]: I0127 11:35:29.517996 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.401608 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.403104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.404902 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405047 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zhqjq" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405245 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.405463 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.447614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.489004 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.491252 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.493733 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.496867 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.522231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.522301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.623635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.623687 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624175 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.624758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.625937 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" 
Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.648689 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"dnsmasq-dns-84bb9d8bd9-vkn58\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.724256 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725669 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.725759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.726761 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.726812 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.741592 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"dnsmasq-dns-5f854695bc-wkh2v\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:35 crc kubenswrapper[4775]: I0127 11:35:35.814311 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.194013 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.194589 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:36 crc kubenswrapper[4775]: I0127 11:35:36.261405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:36 crc kubenswrapper[4775]: W0127 11:35:36.262602 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7b04272_f555_4b48_8702_64db912ff8e8.slice/crio-d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1 WatchSource:0}: Error finding container d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1: Status 404 returned error can't find the container with id d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1 Jan 27 11:35:37 crc kubenswrapper[4775]: I0127 11:35:37.178860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" event={"ID":"b7b04272-f555-4b48-8702-64db912ff8e8","Type":"ContainerStarted","Data":"d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1"} Jan 27 11:35:37 crc kubenswrapper[4775]: I0127 11:35:37.181601 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" event={"ID":"5cc2363e-089b-47c6-bb51-769dc3b41aef","Type":"ContainerStarted","Data":"a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986"} Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.037120 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.067639 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.068881 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.072651 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.176999 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.177083 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.177232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.279239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.279391 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.280572 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.280734 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.281354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.343133 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9dg2\" (UniqueName: 
\"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"dnsmasq-dns-744ffd65bc-4xzdj\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.372614 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.393943 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.400607 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.402541 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.407141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488630 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488724 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.488791 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594296 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.594437 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.595831 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.596579 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.629802 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"dnsmasq-dns-95f5f6995-bzxbb\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.748044 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:35:38 crc kubenswrapper[4775]: I0127 11:35:38.968084 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:35:38 crc kubenswrapper[4775]: W0127 11:35:38.974633 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7196167_1cda_485b_9bec_36ab0e666568.slice/crio-23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e WatchSource:0}: Error finding container 23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e: Status 404 returned error can't find the container with id 23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.198056 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerStarted","Data":"23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e"} Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.228522 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.229616 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.231928 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232127 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232260 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-44htb" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.232389 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.238941 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.239369 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.239563 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.246416 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.255464 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:35:39 crc kubenswrapper[4775]: W0127 11:35:39.277842 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0ffffa8_8199_4d59_927b_5563eda147fd.slice/crio-3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6 WatchSource:0}: Error finding container 3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6: Status 404 returned error can't find the container with id 3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6 Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340298 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340672 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340708 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.340803 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441729 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 
11:35:39.441815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.441976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442015 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442503 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442803 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.442814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.443148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.443854 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.446669 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.450253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.450393 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.454030 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.466993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.466070 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.475643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.524567 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.529595 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532255 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gp9fv" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532278 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532378 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.532691 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.534999 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.535180 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.536214 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.538889 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.562384 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646038 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646115 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646320 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.646383 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747917 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747974 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.747993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 
crc kubenswrapper[4775]: I0127 11:35:39.748032 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748099 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748113 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.748843 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.749371 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.749528 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.750022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.750732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.753218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.765387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.768360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.771313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.772315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.778403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:39 crc kubenswrapper[4775]: I0127 11:35:39.854153 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.117378 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:35:40 crc kubenswrapper[4775]: W0127 11:35:40.151307 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01ba029b_2296_4519_b6b1_04674355258f.slice/crio-3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec WatchSource:0}: Error finding container 3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec: Status 404 returned error can't find the container with id 3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.232007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerStarted","Data":"3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6"} Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.233621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec"} Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.394641 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:35:40 crc kubenswrapper[4775]: W0127 11:35:40.408210 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83263987_4e3c_4e95_9083_bb6a43f52410.slice/crio-85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588 WatchSource:0}: Error finding container 85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588: Status 404 returned error can't find the container with id 85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588 Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.681812 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.686129 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688385 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688424 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.688564 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-h6vhf" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.690957 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.691880 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.694381 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760910 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760935 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760956 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.760988 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.761021 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.761043 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861778 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861836 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861906 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861922 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.861963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" 
(UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.864006 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.864420 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kolla-config\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.865130 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.865968 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.866049 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-config-data-default\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.881314 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.881853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.891700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4t2k\" (UniqueName: \"kubernetes.io/projected/9bafbfb6-d113-4a0f-a1dd-0d001a5448de-kube-api-access-z4t2k\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:40 crc kubenswrapper[4775]: I0127 11:35:40.894861 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"9bafbfb6-d113-4a0f-a1dd-0d001a5448de\") " pod="openstack/openstack-galera-0" Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.011518 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.248211 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588"} Jan 27 11:35:41 crc kubenswrapper[4775]: I0127 11:35:41.598890 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.041675 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.045481 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.048364 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.048407 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.049160 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.049524 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-t4hmx" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.067743 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.096136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.096842 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq7pv\" (UniqueName: 
\"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097493 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097646 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.097948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.194895 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.197731 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.199509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dhmh7" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201118 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201177 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201218 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201239 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-pq7pv\" (UniqueName: \"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201351 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.201412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.205296 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.206560 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.206749 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.206919 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.207222 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.208236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.208431 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.223268 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.223678 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq7pv\" (UniqueName: \"kubernetes.io/projected/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-kube-api-access-pq7pv\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.230286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.255163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.261966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6108f26d-5e0a-490c-a7a4-8cefa3b99c7d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d\") " pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.307813 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.307865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308372 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.308486 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.375800 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.409804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.409983 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410050 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.410111 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.411111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-kolla-config\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.411771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/07cc1808-c408-433d-aefa-f603408de606-config-data\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.418883 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-memcached-tls-certs\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.422082 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07cc1808-c408-433d-aefa-f603408de606-combined-ca-bundle\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 
11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.426762 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggltd\" (UniqueName: \"kubernetes.io/projected/07cc1808-c408-433d-aefa-f603408de606-kube-api-access-ggltd\") pod \"memcached-0\" (UID: \"07cc1808-c408-433d-aefa-f603408de606\") " pod="openstack/memcached-0" Jan 27 11:35:42 crc kubenswrapper[4775]: I0127 11:35:42.609912 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.328819 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.329957 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.333422 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-vjlch" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.342065 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.347810 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.442538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.493204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"kube-state-metrics-0\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " pod="openstack/kube-state-metrics-0" Jan 27 11:35:44 crc kubenswrapper[4775]: I0127 11:35:44.654278 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.331078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"faa33ccf16c94ee677fe261429dec55ad1899914247eb8bf91e4fa85aee22616"} Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.853228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.854382 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857254 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-nb9n9" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857486 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857500 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857634 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.857733 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.869095 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899054 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899123 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899175 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899259 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:47 crc kubenswrapper[4775]: I0127 11:35:47.899304 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.000989 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001051 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001204 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001231 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.001282 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc 
kubenswrapper[4775]: I0127 11:35:48.001699 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002121 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-config\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.002845 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/09719e3d-fd6c-4c22-8c15-8ef911bc6598-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.008368 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.008686 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.010270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/09719e3d-fd6c-4c22-8c15-8ef911bc6598-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.019229 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7vvw\" (UniqueName: \"kubernetes.io/projected/09719e3d-fd6c-4c22-8c15-8ef911bc6598-kube-api-access-d7vvw\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.029591 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-nb-0\" (UID: \"09719e3d-fd6c-4c22-8c15-8ef911bc6598\") " pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.186943 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.246267 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.249190 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.268866 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.315923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417114 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417141 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417573 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.417585 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.437202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"redhat-marketplace-c4p9c\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") " pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.583082 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.586428 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.587639 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.589622 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-nj7nm" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.589958 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.593891 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.596636 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.597974 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.604879 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620182 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620230 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620270 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620314 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620331 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620399 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620464 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620507 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.620534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.622000 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722398 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722425 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722478 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722507 4775 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722537 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722610 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.722733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723291 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-log-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " 
pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723417 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-run\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723492 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-lib\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.723984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/cacc7142-a8d4-4607-adb7-0090fbd3024a-var-run-ovn\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.724142 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-etc-ovs\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.724260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b06b991d-b108-4b21-82e5-43b3662c7aee-var-log\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.726071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cacc7142-a8d4-4607-adb7-0090fbd3024a-scripts\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.732806 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-combined-ca-bundle\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.737008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/cacc7142-a8d4-4607-adb7-0090fbd3024a-ovn-controller-tls-certs\") pod \"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742057 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tsv\" (UniqueName: \"kubernetes.io/projected/cacc7142-a8d4-4607-adb7-0090fbd3024a-kube-api-access-k7tsv\") pod 
\"ovn-controller-4hqln\" (UID: \"cacc7142-a8d4-4607-adb7-0090fbd3024a\") " pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742125 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9rv\" (UniqueName: \"kubernetes.io/projected/b06b991d-b108-4b21-82e5-43b3662c7aee-kube-api-access-dn9rv\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.742425 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b06b991d-b108-4b21-82e5-43b3662c7aee-scripts\") pod \"ovn-controller-ovs-l9blz\" (UID: \"b06b991d-b108-4b21-82e5-43b3662c7aee\") " pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.906212 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln" Jan 27 11:35:48 crc kubenswrapper[4775]: I0127 11:35:48.915358 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.038234 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.039735 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042463 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042687 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-gvhkg" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.042849 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.045161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.058205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.161930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 
crc kubenswrapper[4775]: I0127 11:35:51.162068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162208 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.162395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.234023 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.236036 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.241878 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.268139 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270847 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270883 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.270982 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.271734 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.271809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc 
kubenswrapper[4775]: I0127 11:35:51.272767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-config\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.273434 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.274022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fb252ada-9191-4d2d-8ab9-d12f4668a35a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.283208 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.294838 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.298605 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqj7\" (UniqueName: \"kubernetes.io/projected/fb252ada-9191-4d2d-8ab9-d12f4668a35a-kube-api-access-knqj7\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.300175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb252ada-9191-4d2d-8ab9-d12f4668a35a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.309925 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fb252ada-9191-4d2d-8ab9-d12f4668a35a\") " pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.367622 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374126 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.374248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.476880 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.480246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.497204 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod 
\"certified-operators-2hp57\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:51 crc kubenswrapper[4775]: I0127 11:35:51.552889 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.294433 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.295264 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2rgjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(83263987-4e3c-4e95-9083-bb6a43f52410): ErrImagePull: rpc 
error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.296509 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" Jan 27 11:35:57 crc kubenswrapper[4775]: E0127 11:35:57.403905 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq@sha256:e733252aab7f4bc0efbdd712bcd88e44c5498bf1773dba843bc9dcfac324fe3d\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.070100 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.070307 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qkn2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-95f5f6995-bzxbb_openstack(a0ffffa8-8199-4d59-927b-5563eda147fd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.071585 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.079869 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.079980 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d9dg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-744ffd65bc-4xzdj_openstack(c7196167-1cda-485b-9bec-36ab0e666568): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.081316 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = 
copying config: context canceled\"" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.099716 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.099844 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8jg7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-84bb9d8bd9-vkn58_openstack(5cc2363e-089b-47c6-bb51-769dc3b41aef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.101045 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" podUID="5cc2363e-089b-47c6-bb51-769dc3b41aef" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.138269 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.138444 4775 kuberuntime_manager.go:1274] 
"Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ph6kn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5f854695bc-wkh2v_openstack(b7b04272-f555-4b48-8702-64db912ff8e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.139677 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" podUID="b7b04272-f555-4b48-8702-64db912ff8e8" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.403687 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" Jan 27 11:35:58 crc kubenswrapper[4775]: E0127 11:35:58.404115 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:ea0bf67f1aa5d95a9a07b9c8692c293470f1311792c55d3d57f1f92e56689c33\\\"\"" 
pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" Jan 27 11:35:59 crc kubenswrapper[4775]: I0127 11:35:59.518094 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:35:59 crc kubenswrapper[4775]: I0127 11:35:59.518401 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.417497 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" event={"ID":"5cc2363e-089b-47c6-bb51-769dc3b41aef","Type":"ContainerDied","Data":"a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986"} Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.417794 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8604ffeb677e0cea30bef20c89a17cd47defbaf98fc694f2a3e1671b6855986" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.421385 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" event={"ID":"b7b04272-f555-4b48-8702-64db912ff8e8","Type":"ContainerDied","Data":"d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1"} Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.421430 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2ee6c93f1b9466da2dfda39bdf6bf1e19eec982ce2dc4c9cb57e47158a3c3d1" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.494604 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.506668 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.626879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627257 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627307 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") pod \"5cc2363e-089b-47c6-bb51-769dc3b41aef\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") pod \"b7b04272-f555-4b48-8702-64db912ff8e8\" (UID: \"b7b04272-f555-4b48-8702-64db912ff8e8\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627434 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") pod \"5cc2363e-089b-47c6-bb51-769dc3b41aef\" (UID: \"5cc2363e-089b-47c6-bb51-769dc3b41aef\") " Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627432 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.627830 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.628002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config" (OuterVolumeSpecName: "config") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.628283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config" (OuterVolumeSpecName: "config") pod "5cc2363e-089b-47c6-bb51-769dc3b41aef" (UID: "5cc2363e-089b-47c6-bb51-769dc3b41aef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.634313 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w" (OuterVolumeSpecName: "kube-api-access-8jg7w") pod "5cc2363e-089b-47c6-bb51-769dc3b41aef" (UID: "5cc2363e-089b-47c6-bb51-769dc3b41aef"). InnerVolumeSpecName "kube-api-access-8jg7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.635904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn" (OuterVolumeSpecName: "kube-api-access-ph6kn") pod "b7b04272-f555-4b48-8702-64db912ff8e8" (UID: "b7b04272-f555-4b48-8702-64db912ff8e8"). InnerVolumeSpecName "kube-api-access-ph6kn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729367 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jg7w\" (UniqueName: \"kubernetes.io/projected/5cc2363e-089b-47c6-bb51-769dc3b41aef-kube-api-access-8jg7w\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729401 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7b04272-f555-4b48-8702-64db912ff8e8-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729413 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cc2363e-089b-47c6-bb51-769dc3b41aef-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:00 crc kubenswrapper[4775]: I0127 11:36:00.729425 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph6kn\" (UniqueName: \"kubernetes.io/projected/b7b04272-f555-4b48-8702-64db912ff8e8-kube-api-access-ph6kn\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.232702 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.252057 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.260204 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.267294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 27 11:36:01 crc kubenswrapper[4775]: W0127 11:36:01.278927 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6108f26d_5e0a_490c_a7a4_8cefa3b99c7d.slice/crio-9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93 WatchSource:0}: Error finding container 9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93: Status 404 returned error can't find the container with id 9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93 Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.318185 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.348820 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] 
Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.393861 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-l9blz"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.435605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"07cc1808-c408-433d-aefa-f603408de606","Type":"ContainerStarted","Data":"325fbdc7382072689a0cdf3c2a42c3b72df96fb34e54cc553f116073ccfaaf84"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.437144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln" event={"ID":"cacc7142-a8d4-4607-adb7-0090fbd3024a","Type":"ContainerStarted","Data":"afb98490afa7e426904643804172eb216595d47e10150753b8f0072725543d31"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.438709 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerStarted","Data":"8c517699b915acc52e0019dc1c45d2e9a3ea6904e06f7498f332512ca9be5304"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.441937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerStarted","Data":"d68aa08b8c10efd267dbb532a84a73914540135473560968b1351b3eea784ca0"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.447516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"9739559ce07c1c35544d9bf263fe84c6d806047f757668bd795979f346bc4b93"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.448577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerStarted","Data":"3c9369265622ba39ffc877111bca425ea61080e3b7bb1ee8ddc44e387299ce63"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.449574 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"2f0eac791862efa9ae20f0c460c94f3f0d3d8c62d5c60cb786f51daaccd12a67"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.450670 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-84bb9d8bd9-vkn58" Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.453758 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f854695bc-wkh2v" Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.453871 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2"} Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.603168 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.613442 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-84bb9d8bd9-vkn58"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.634576 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.645472 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f854695bc-wkh2v"] Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.760822 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cc2363e-089b-47c6-bb51-769dc3b41aef" path="/var/lib/kubelet/pods/5cc2363e-089b-47c6-bb51-769dc3b41aef/volumes" Jan 27 11:36:01 crc kubenswrapper[4775]: I0127 11:36:01.764662 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b04272-f555-4b48-8702-64db912ff8e8" path="/var/lib/kubelet/pods/b7b04272-f555-4b48-8702-64db912ff8e8/volumes" Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.210709 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 27 11:36:02 crc kubenswrapper[4775]: W0127 11:36:02.213707 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb252ada_9191_4d2d_8ab9_d12f4668a35a.slice/crio-d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6 WatchSource:0}: Error finding container d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6: Status 404 returned error can't find the container with id d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6 Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.467086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"d6c900e6adc6a0b4e94b147e02216692a11ad65e67d333fc459319a8154aabd6"} Jan 27 11:36:02 crc kubenswrapper[4775]: I0127 11:36:02.470278 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 27 11:36:03 crc kubenswrapper[4775]: I0127 11:36:03.479936 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"ea65f3829f0e7ba1ea821bc67e68f186029a80db38866cc31a1e47e34ca5b8ba"} Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.497919 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca"} Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.503894 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac" exitCode=0 Jan 27 
11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.503973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac"} Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.507490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414"} Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.512350 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5" exitCode=0 Jan 27 11:36:05 crc kubenswrapper[4775]: I0127 11:36:05.512487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"} Jan 27 11:36:08 crc kubenswrapper[4775]: I0127 11:36:08.534596 4775 generic.go:334] "Generic (PLEG): container finished" podID="9bafbfb6-d113-4a0f-a1dd-0d001a5448de" containerID="6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2" exitCode=0 Jan 27 11:36:08 crc kubenswrapper[4775]: I0127 11:36:08.534677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerDied","Data":"6530e42900cb10d3b44da1f3748193697a8d76b8e0700bb184bc17e4bf83e4a2"} Jan 27 11:36:09 crc kubenswrapper[4775]: I0127 11:36:09.550628 4775 generic.go:334] "Generic (PLEG): container finished" podID="6108f26d-5e0a-490c-a7a4-8cefa3b99c7d" containerID="483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414" exitCode=0 Jan 27 11:36:09 crc kubenswrapper[4775]: I0127 11:36:09.550690 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerDied","Data":"483189a828072cd5ee41fa53f904c71c2a5f7f660672f0172f3cf62b12572414"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.592082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9bafbfb6-d113-4a0f-a1dd-0d001a5448de","Type":"ContainerStarted","Data":"c88e0678799b6a493451f75ba923d4eb4c771df920e39200189f796c2acf1415"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.600796 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln" event={"ID":"cacc7142-a8d4-4607-adb7-0090fbd3024a","Type":"ContainerStarted","Data":"25719052291ce0784c5cdefd82e34f01bac4601d4f6545a9d38f8ae2876c9ef3"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.600903 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-4hqln" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.604040 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead" exitCode=0 Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.604192 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" 
event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.607504 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c" exitCode=0 Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.607658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.614514 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.026760383 podStartE2EDuration="32.614497814s" podCreationTimestamp="2026-01-27 11:35:39 +0000 UTC" firstStartedPulling="2026-01-27 11:35:46.850752625 +0000 UTC m=+925.992350402" lastFinishedPulling="2026-01-27 11:36:00.438490046 +0000 UTC m=+939.580087833" observedRunningTime="2026-01-27 11:36:11.613586248 +0000 UTC m=+950.755184035" watchObservedRunningTime="2026-01-27 11:36:11.614497814 +0000 UTC m=+950.756095591" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.616059 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerID="357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da" exitCode=0 Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.616108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.620014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.628501 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"2b3a4e1cf69bcdc101b9c3a20d0c3eb06f0e0a30352b47c0121c787b62e10c75"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.632084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerStarted","Data":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.632300 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.635632 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.649766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"6108f26d-5e0a-490c-a7a4-8cefa3b99c7d","Type":"ContainerStarted","Data":"559b79147016a36fa1d81a8459f89139f84ea56d20c9fb27270d55252a8da1c8"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.651626 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"09edec2805bedc01877cbe4c65bbfc4e8b77a8b718dcb6d0a4db1de2befc8064"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.656860 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"07cc1808-c408-433d-aefa-f603408de606","Type":"ContainerStarted","Data":"df0a956ea4f5334c7744ad328c65be6ec217655067de8f9c23f23040ea40c16a"} Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.657029 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.687204 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-4hqln" podStartSLOduration=15.149826773 podStartE2EDuration="23.687174357s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.24821295 +0000 UTC m=+940.389810727" lastFinishedPulling="2026-01-27 11:36:09.785560484 +0000 UTC m=+948.927158311" observedRunningTime="2026-01-27 11:36:11.67775071 +0000 UTC m=+950.819348507" watchObservedRunningTime="2026-01-27 11:36:11.687174357 +0000 UTC m=+950.828772154" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.724086 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.724068495 podStartE2EDuration="30.724068495s" podCreationTimestamp="2026-01-27 11:35:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:11.717071224 +0000 UTC m=+950.858669021" watchObservedRunningTime="2026-01-27 11:36:11.724068495 +0000 UTC m=+950.865666272" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.766796 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=18.691000081 podStartE2EDuration="27.766777011s" podCreationTimestamp="2026-01-27 11:35:44 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.276105071 +0000 UTC m=+940.417702838" lastFinishedPulling="2026-01-27 11:36:10.351881981 +0000 UTC m=+949.493479768" observedRunningTime="2026-01-27 11:36:11.761140247 +0000 UTC m=+950.902738024" watchObservedRunningTime="2026-01-27 11:36:11.766777011 +0000 UTC m=+950.908374788" Jan 27 11:36:11 crc kubenswrapper[4775]: I0127 11:36:11.807464 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.385546539 podStartE2EDuration="29.807433621s" podCreationTimestamp="2026-01-27 11:35:42 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.248300272 +0000 UTC m=+940.389898059" lastFinishedPulling="2026-01-27 11:36:09.670187354 +0000 UTC m=+948.811785141" observedRunningTime="2026-01-27 11:36:11.798255171 +0000 UTC m=+950.939852948" watchObservedRunningTime="2026-01-27 11:36:11.807433621 +0000 UTC m=+950.949031398" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.382039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.382412 
4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.665086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerStarted","Data":"c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4"} Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.667520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerStarted","Data":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"} Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.675627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerStarted","Data":"8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b"} Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.676172 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.678491 4775 generic.go:334] "Generic (PLEG): container finished" podID="b06b991d-b108-4b21-82e5-43b3662c7aee" containerID="a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108" exitCode=0 Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.681618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerDied","Data":"a990d77a3cf0838c74e707e18849c6485ec63481f4ef3b07356d3e8995dbd108"} Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.685679 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hp57" podStartSLOduration=15.130154556 podStartE2EDuration="21.685665037s" podCreationTimestamp="2026-01-27 11:35:51 +0000 UTC" firstStartedPulling="2026-01-27 11:36:05.506423941 +0000 UTC m=+944.648021718" lastFinishedPulling="2026-01-27 11:36:12.061934422 +0000 UTC m=+951.203532199" observedRunningTime="2026-01-27 11:36:12.682008326 +0000 UTC m=+951.823606103" watchObservedRunningTime="2026-01-27 11:36:12.685665037 +0000 UTC m=+951.827262804" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.701789 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podStartSLOduration=3.67405448 podStartE2EDuration="34.701771976s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:39.28006511 +0000 UTC m=+918.421662877" lastFinishedPulling="2026-01-27 11:36:10.307782596 +0000 UTC m=+949.449380373" observedRunningTime="2026-01-27 11:36:12.696793581 +0000 UTC m=+951.838391348" watchObservedRunningTime="2026-01-27 11:36:12.701771976 +0000 UTC m=+951.843369753" Jan 27 11:36:12 crc kubenswrapper[4775]: I0127 11:36:12.742623 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c4p9c" podStartSLOduration=17.921028783 podStartE2EDuration="24.742605281s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:05.51407352 +0000 UTC m=+944.655671307" lastFinishedPulling="2026-01-27 11:36:12.335650028 +0000 UTC m=+951.477247805" 
observedRunningTime="2026-01-27 11:36:12.742251882 +0000 UTC m=+951.883849659" watchObservedRunningTime="2026-01-27 11:36:12.742605281 +0000 UTC m=+951.884203058" Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.693202 4775 generic.go:334] "Generic (PLEG): container finished" podID="c7196167-1cda-485b-9bec-36ab0e666568" containerID="97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6" exitCode=0 Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.693876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6"} Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.710910 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fb252ada-9191-4d2d-8ab9-d12f4668a35a","Type":"ContainerStarted","Data":"91043a0286a475941ff1e2e15ebb1ad2db71ddc9330b5fcf6a29f56fdf7de1f4"} Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.715229 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"512935f0d3a93c8d67e1544c268864b15cd99b14f0311dd4e91ce2d818961543"} Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.716701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"09719e3d-fd6c-4c22-8c15-8ef911bc6598","Type":"ContainerStarted","Data":"91f6dd66343e9ed907e5d0f090a63d8fc62fde24a06bc31cbe61712fbc467f0a"} Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.771691 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=12.648922913 podStartE2EDuration="24.771670295s" podCreationTimestamp="2026-01-27 11:35:50 +0000 UTC" firstStartedPulling="2026-01-27 11:36:02.215349052 +0000 UTC m=+941.356946829" lastFinishedPulling="2026-01-27 11:36:14.338096434 +0000 UTC m=+953.479694211" observedRunningTime="2026-01-27 11:36:14.760217092 +0000 UTC m=+953.901814889" watchObservedRunningTime="2026-01-27 11:36:14.771670295 +0000 UTC m=+953.913268082" Jan 27 11:36:14 crc kubenswrapper[4775]: I0127 11:36:14.795890 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=16.957278334 podStartE2EDuration="28.795850426s" podCreationTimestamp="2026-01-27 11:35:46 +0000 UTC" firstStartedPulling="2026-01-27 11:36:02.482542359 +0000 UTC m=+941.624140136" lastFinishedPulling="2026-01-27 11:36:14.321114461 +0000 UTC m=+953.462712228" observedRunningTime="2026-01-27 11:36:14.788904166 +0000 UTC m=+953.930501953" watchObservedRunningTime="2026-01-27 11:36:14.795850426 +0000 UTC m=+953.937448313" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.187715 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.230889 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.368843 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.410043 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-sb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.726255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerStarted","Data":"ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3"} Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.726646 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730096 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-l9blz" event={"ID":"b06b991d-b108-4b21-82e5-43b3662c7aee","Type":"ContainerStarted","Data":"58471d545306326985d3bc8a879bfb69c3624a87f6ef783ea2f890ec8db36211"} Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730573 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.730635 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.756549 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podStartSLOduration=-9223371999.098242 podStartE2EDuration="37.756533802s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:38.976109389 +0000 UTC m=+918.117707166" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:15.75499285 +0000 UTC m=+954.896590647" watchObservedRunningTime="2026-01-27 11:36:15.756533802 +0000 UTC m=+954.898131579" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.777581 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.790264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.795736 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-l9blz" podStartSLOduration=19.059865316 podStartE2EDuration="27.795718272s" podCreationTimestamp="2026-01-27 11:35:48 +0000 UTC" firstStartedPulling="2026-01-27 11:36:01.42982411 +0000 UTC m=+940.571421897" lastFinishedPulling="2026-01-27 11:36:10.165677076 +0000 UTC m=+949.307274853" observedRunningTime="2026-01-27 11:36:15.79162992 +0000 UTC m=+954.933227687" watchObservedRunningTime="2026-01-27 11:36:15.795718272 +0000 UTC m=+954.937316059" Jan 27 11:36:15 crc kubenswrapper[4775]: I0127 11:36:15.986704 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.026161 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.027510 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.031350 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.049682 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.064344 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9xncr"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.065466 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.067204 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.075109 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9xncr"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131615 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131707 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 
11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.131930 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.132114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.209260 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.209622 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns" containerID="cri-o://8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b" gracePeriod=10 Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.210575 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233786 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233872 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233921 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233948 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.233992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234022 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234056 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234134 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.234161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.235615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.235627 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-config\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236634 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovn-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.236673 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-ovs-rundir\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.237192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.243001 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.247169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-combined-ca-bundle\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.253808 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"dnsmasq-dns-794868bd45-kgvb6\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.254514 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.256194 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7mn\" (UniqueName: \"kubernetes.io/projected/7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5-kube-api-access-vr7mn\") pod \"ovn-controller-metrics-9xncr\" (UID: \"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5\") " pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.257439 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.268528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.281250 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.338663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339158 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339197 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.339231 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.347636 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.351045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.353539 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.356218 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.356419 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4q76m" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.357073 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.359816 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.365664 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.378348 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9xncr" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440755 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440804 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.440832 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441648 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441822 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441869 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod 
\"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441913 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441935 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.441978 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442015 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442100 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.442747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.443272 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.444601 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " 
pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.460024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"dnsmasq-dns-757dc6fff9-t2sfn\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547721 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547759 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.547788 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.548558 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-scripts\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.551080 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bb656eb-1eea-436d-acf3-6d8a548a97e5-config\") pod \"ovn-northd-0\" (UID: 
\"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.558193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.559582 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.560101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.560174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6bb656eb-1eea-436d-acf3-6d8a548a97e5-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.569187 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn72b\" (UniqueName: \"kubernetes.io/projected/6bb656eb-1eea-436d-acf3-6d8a548a97e5-kube-api-access-jn72b\") pod \"ovn-northd-0\" (UID: \"6bb656eb-1eea-436d-acf3-6d8a548a97e5\") " pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.674809 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.740142 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.741632 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.741668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.804620 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.808388 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.824607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855911 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.855933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.885219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.928436 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9xncr"] Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957615 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.957882 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.958394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.958997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:16 crc kubenswrapper[4775]: I0127 11:36:16.977614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"community-operators-8s5p8\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.142497 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.178966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:17 crc kubenswrapper[4775]: W0127 11:36:17.196749 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31d3ee22_9b3b_46ac_b896_ba5c521e1753.slice/crio-b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28 WatchSource:0}: Error finding container b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28: Status 404 returned error can't find the container with id b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28 Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.356837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.613299 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.613762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.773972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.774010 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.777357 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9xncr" event={"ID":"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5","Type":"ContainerStarted","Data":"298409a96c786441a60df4dd9efea24a8e3bca3bb653c04e5c0a57d1df204821"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.777380 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9xncr" event={"ID":"7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5","Type":"ContainerStarted","Data":"0c8e62c1a291e0de5a79d6048e73580e36b94e4698e9d98bcf379972996a6c37"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.780174 4775 generic.go:334] "Generic (PLEG): container finished" podID="de956838-03d3-41d8-96d3-a85293eff207" containerID="2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f" exitCode=0 Jan 27 11:36:17 crc 
kubenswrapper[4775]: I0127 11:36:17.780224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.780249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerStarted","Data":"fd5b11d9815172e4b8d84472d53d5f5c5a67656114e7d95c656d684b7f601224"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.782914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"2bf60354b1e345338bcc540ea18be851719d1616f6dbd39f82c6fe1a3f139081"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.797940 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9xncr" podStartSLOduration=1.7979232920000001 podStartE2EDuration="1.797923292s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:17.793146141 +0000 UTC m=+956.934743928" watchObservedRunningTime="2026-01-27 11:36:17.797923292 +0000 UTC m=+956.939521069" Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.803562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"f2f98601e1fc4d2ec97f9e0c70c2dbd57bb16d6fdfa6d9ac4a20a475acb21242"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.811833 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerID="8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b" exitCode=0 Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.811899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b"} Jan 27 11:36:17 crc kubenswrapper[4775]: I0127 11:36:17.813265 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns" containerID="cri-o://ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" gracePeriod=10 Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.099068 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187162 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187274 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.187318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") pod \"a0ffffa8-8199-4d59-927b-5563eda147fd\" (UID: \"a0ffffa8-8199-4d59-927b-5563eda147fd\") " Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.193721 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z" (OuterVolumeSpecName: "kube-api-access-qkn2z") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "kube-api-access-qkn2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.232147 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.232993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config" (OuterVolumeSpecName: "config") pod "a0ffffa8-8199-4d59-927b-5563eda147fd" (UID: "a0ffffa8-8199-4d59-927b-5563eda147fd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294500 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294553 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ffffa8-8199-4d59-927b-5563eda147fd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.294566 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkn2z\" (UniqueName: \"kubernetes.io/projected/a0ffffa8-8199-4d59-927b-5563eda147fd-kube-api-access-qkn2z\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.584820 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.585042 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.661088 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.821307 4775 generic.go:334] "Generic (PLEG): container finished" podID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9" exitCode=0 Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.821406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.824400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerStarted","Data":"be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.824501 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826308 4775 generic.go:334] "Generic (PLEG): container finished" podID="c7196167-1cda-485b-9bec-36ab0e666568" containerID="ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" exitCode=0 Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826335 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" event={"ID":"c7196167-1cda-485b-9bec-36ab0e666568","Type":"ContainerDied","Data":"23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.826441 4775 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="23b427342d95b6e773ad84c1894a7f982f371e2925c24ce5c4881c1467a1c55e" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.828472 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051" exitCode=0 Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.828520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.829651 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" event={"ID":"a0ffffa8-8199-4d59-927b-5563eda147fd","Type":"ContainerDied","Data":"3b01344aca1f47a063297fbb9583a3d253a510b103b9f161a6d9fe9205de60d6"} Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830178 4775 scope.go:117] "RemoveContainer" containerID="8be4620fe03275aaa5212ee572a3a0d887cfc63a9b8b6239245c1bca75f7d04b" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.830284 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95f5f6995-bzxbb" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.860316 4775 scope.go:117] "RemoveContainer" containerID="357b113e0ab8b0acfafd5e8a4b10ed58eb7061e0cf48c4acf628d4887c7b99da" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.868373 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podStartSLOduration=2.868354615 podStartE2EDuration="2.868354615s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:18.863073402 +0000 UTC m=+958.004671179" watchObservedRunningTime="2026-01-27 11:36:18.868354615 +0000 UTC m=+958.009952392" Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.923894 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.930514 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95f5f6995-bzxbb"] Jan 27 11:36:18 crc kubenswrapper[4775]: I0127 11:36:18.968376 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c4p9c" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.008020 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.008294 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 
11:36:19.008503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") pod \"c7196167-1cda-485b-9bec-36ab0e666568\" (UID: \"c7196167-1cda-485b-9bec-36ab0e666568\") " Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.016004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2" (OuterVolumeSpecName: "kube-api-access-d9dg2") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "kube-api-access-d9dg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.063103 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config" (OuterVolumeSpecName: "config") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.063111 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c7196167-1cda-485b-9bec-36ab0e666568" (UID: "c7196167-1cda-485b-9bec-36ab0e666568"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110176 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110201 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9dg2\" (UniqueName: \"kubernetes.io/projected/c7196167-1cda-485b-9bec-36ab0e666568-kube-api-access-d9dg2\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.110212 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c7196167-1cda-485b-9bec-36ab0e666568-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401144 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"] Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401606 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="init" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401627 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="init" Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401649 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns" Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401658 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns" Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401674 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns" Jan 27 11:36:19 crc kubenswrapper[4775]: 
I0127 11:36:19.401681 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: E0127 11:36:19.401703 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401710 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="init"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401902 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7196167-1cda-485b-9bec-36ab0e666568" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.401922 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" containerName="dnsmasq-dns"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.403415 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.423705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517251 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517625 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.517728 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619159 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.619865 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.644015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"redhat-operators-t2tfh\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.727723 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.757393 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ffffa8-8199-4d59-927b-5563eda147fd" path="/var/lib/kubelet/pods/a0ffffa8-8199-4d59-927b-5563eda147fd/volumes"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.855917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerStarted","Data":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.856302 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"27eec8f7676cb37dd7daff05707969bb0fd08a19b54b71f6d798ce512808ccdd"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858042 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6bb656eb-1eea-436d-acf3-6d8a548a97e5","Type":"ContainerStarted","Data":"f95cf09a61434fbcb9c78b8a19d1d14e6adc70a9c67bd12d89c862c967841ad5"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.858853 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.863617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b"}
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.866464 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-744ffd65bc-4xzdj"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.879679 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" podStartSLOduration=3.879664415 podStartE2EDuration="3.879664415s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:19.877270549 +0000 UTC m=+959.018868326" watchObservedRunningTime="2026-01-27 11:36:19.879664415 +0000 UTC m=+959.021262192"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.899682 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.546321941 podStartE2EDuration="3.899664321s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="2026-01-27 11:36:17.349511836 +0000 UTC m=+956.491109613" lastFinishedPulling="2026-01-27 11:36:18.702854216 +0000 UTC m=+957.844451993" observedRunningTime="2026-01-27 11:36:19.893218884 +0000 UTC m=+959.034816671" watchObservedRunningTime="2026-01-27 11:36:19.899664321 +0000 UTC m=+959.041262098"
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.939117 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"]
Jan 27 11:36:19 crc kubenswrapper[4775]: I0127 11:36:19.956975 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-744ffd65bc-4xzdj"]
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.204832 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"]
Jan 27 11:36:20 crc kubenswrapper[4775]: W0127 11:36:20.207527 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6330ccb9_6a5a_42d6_8c0f_b3c395b867a0.slice/crio-99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab WatchSource:0}: Error finding container 99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab: Status 404 returned error can't find the container with id 99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.505999 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.727967 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872193 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" exitCode=0
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872244 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a"}
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.872293 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab"}
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.874186 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b" exitCode=0
Jan 27 11:36:20 crc kubenswrapper[4775]: I0127 11:36:20.874263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.012632 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.012946 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.126175 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.127109 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.129165 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.139648 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.158098 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.179141 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.179406 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c4p9c" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server" containerID="cri-o://6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c" gracePeriod=2
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.281226 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.281551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.383644 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.383716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.384668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.410281 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"root-account-create-update-jz4kw\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") " pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.443802 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.557386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.557819 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.617919 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.652548 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792555 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792601 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") pod \"32c63ae2-f837-485f-9f74-0606288c3666\" (UID: \"32c63ae2-f837-485f-9f74-0606288c3666\") "
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.792772 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7196167-1cda-485b-9bec-36ab0e666568" path="/var/lib/kubelet/pods/c7196167-1cda-485b-9bec-36ab0e666568/volumes"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.793798 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities" (OuterVolumeSpecName: "utilities") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.801708 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk" (OuterVolumeSpecName: "kube-api-access-vpgqk") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "kube-api-access-vpgqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.845136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "32c63ae2-f837-485f-9f74-0606288c3666" (UID: "32c63ae2-f837-485f-9f74-0606288c3666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.882915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.886014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerStarted","Data":"980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.889438 4775 generic.go:334] "Generic (PLEG): container finished" podID="32c63ae2-f837-485f-9f74-0606288c3666" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c" exitCode=0
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.890070 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c4p9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891634 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c4p9c" event={"ID":"32c63ae2-f837-485f-9f74-0606288c3666","Type":"ContainerDied","Data":"3c9369265622ba39ffc877111bca425ea61080e3b7bb1ee8ddc44e387299ce63"}
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.891718 4775 scope.go:117] "RemoveContainer" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893721 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893741 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/32c63ae2-f837-485f-9f74-0606288c3666-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.893753 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpgqk\" (UniqueName: \"kubernetes.io/projected/32c63ae2-f837-485f-9f74-0606288c3666-kube-api-access-vpgqk\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.930586 4775 scope.go:117] "RemoveContainer" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.943884 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s5p8" podStartSLOduration=3.510914934 podStartE2EDuration="5.943866428s" podCreationTimestamp="2026-01-27 11:36:16 +0000 UTC" firstStartedPulling="2026-01-27 11:36:18.830025079 +0000 UTC m=+957.971622856" lastFinishedPulling="2026-01-27 11:36:21.262976563 +0000 UTC m=+960.404574350" observedRunningTime="2026-01-27 11:36:21.920294314 +0000 UTC m=+961.061892091" watchObservedRunningTime="2026-01-27 11:36:21.943866428 +0000 UTC m=+961.085464205"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.944816 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.951317 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c4p9c"]
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.955281 4775 scope.go:117] "RemoveContainer" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.961034 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hp57"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.981414 4775 scope.go:117] "RemoveContainer" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.982043 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": container with ID starting with 6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c not found: ID does not exist" containerID="6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982077 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c"} err="failed to get container status \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": rpc error: code = NotFound desc = could not find container \"6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c\": container with ID starting with 6d0f69ae68f5a8e093c6972dea6250ba3541c13443a8dbe5c62fa107f993672c not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982096 4775 scope.go:117] "RemoveContainer" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.982479 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": container with ID starting with ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c not found: ID does not exist" containerID="ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982619 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c"} err="failed to get container status \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": rpc error: code = NotFound desc = could not find container \"ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c\": container with ID starting with ce0d836f765294eaef47ae688f272637d2522ed91278a49d749323c8ce914b9c not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.982654 4775 scope.go:117] "RemoveContainer" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: E0127 11:36:21.983695 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": container with ID starting with 8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5 not found: ID does not exist" containerID="8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.983724 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5"} err="failed to get container status \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": rpc error: code = NotFound desc = could not find container \"8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5\": container with ID starting with 8e22a5ed7b825b0ce71c0bd165b64c87509a31bee8b6414d79fc118aee54a1a5 not found: ID does not exist"
Jan 27 11:36:21 crc kubenswrapper[4775]: I0127 11:36:21.988007 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.004507 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jz4kw"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.018975 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.113822 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114127 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-content"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114140 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-content"
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114150 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114155 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: E0127 11:36:22.114164 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-utilities"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114170 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="extract-utilities"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114302 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="32c63ae2-f837-485f-9f74-0606288c3666" containerName="registry-server"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.114787 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.125399 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.197089 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.197394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.232169 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.233341 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.235463 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.245915 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.306461 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.306827 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.307477 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.327054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"keystone-db-create-z98pk\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") " pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.411214 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.411329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.424796 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-m5645"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.425789 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.432865 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5645"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.434327 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513061 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513572 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.513701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.515218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.532021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"keystone-2856-account-create-update-zgmqw\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.545768 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.547111 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.548322 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.567229 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.594080 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.615016 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.615150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.616189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.632335 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"placement-db-create-m5645\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.717105 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.717174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.770043 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.818379 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.818444 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.819105 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.835127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"placement-8d1f-account-create-update-gbh56\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.905461 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" exitCode=0
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.905545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"}
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.907819 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-z98pk"]
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909572 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerID="8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d" exitCode=0
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909806 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerDied","Data":"8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d"}
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.909833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerStarted","Data":"1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68"}
Jan 27 11:36:22 crc kubenswrapper[4775]: I0127 11:36:22.960893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56"
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.018342 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"]
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.198661 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-m5645"]
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.448453 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"]
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.764493 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32c63ae2-f837-485f-9f74-0606288c3666" path="/var/lib/kubelet/pods/32c63ae2-f837-485f-9f74-0606288c3666/volumes"
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925224 4775 generic.go:334] "Generic (PLEG): container finished" podID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerID="7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab" exitCode=0
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerDied","Data":"7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.925425 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerStarted","Data":"8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.926627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerStarted","Data":"25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.926661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerStarted","Data":"2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.932975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerStarted","Data":"a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.933007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerStarted","Data":"df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.938982 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerStarted","Data":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.940727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerStarted","Data":"3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.940782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerStarted","Data":"5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f"}
Jan 27 11:36:23 crc kubenswrapper[4775]: I0127 11:36:23.970074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-2856-account-create-update-zgmqw" podStartSLOduration=1.970050723 podStartE2EDuration="1.970050723s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:23.964669016 +0000 UTC m=+963.106266813" watchObservedRunningTime="2026-01-27 11:36:23.970050723 +0000 UTC m=+963.111648520"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.296639 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.347944 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") pod \"ba97a22f-b5dd-4289-bb3b-39578c05f231\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") "
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.348117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") pod \"ba97a22f-b5dd-4289-bb3b-39578c05f231\" (UID: \"ba97a22f-b5dd-4289-bb3b-39578c05f231\") "
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.348711 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba97a22f-b5dd-4289-bb3b-39578c05f231" (UID: "ba97a22f-b5dd-4289-bb3b-39578c05f231"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.359709 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w" (OuterVolumeSpecName: "kube-api-access-4cv6w") pod "ba97a22f-b5dd-4289-bb3b-39578c05f231" (UID: "ba97a22f-b5dd-4289-bb3b-39578c05f231"). InnerVolumeSpecName "kube-api-access-4cv6w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.449495 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cv6w\" (UniqueName: \"kubernetes.io/projected/ba97a22f-b5dd-4289-bb3b-39578c05f231-kube-api-access-4cv6w\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.449529 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba97a22f-b5dd-4289-bb3b-39578c05f231-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.675462 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.784205 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"]
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.784451 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" containerID="cri-o://be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c" gracePeriod=10
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.786193 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-794868bd45-kgvb6"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811457 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"]
Jan 27 11:36:24 crc kubenswrapper[4775]: E0127 11:36:24.811777 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811793 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.811942 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" containerName="mariadb-account-create-update"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.812728 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.834782 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"]
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949872 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jz4kw" event={"ID":"ba97a22f-b5dd-4289-bb3b-39578c05f231","Type":"ContainerDied","Data":"1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68"}
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949914 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jz4kw"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.949908 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c09ae16eb619392d5985f913efdb766e18ba5d62f5ec6ed6694b2a7ac8efb68"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959653 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.959789 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.976721 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-8d1f-account-create-update-gbh56" podStartSLOduration=2.976702854 podStartE2EDuration="2.976702854s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:24.970071973 +0000 UTC m=+964.111669750" watchObservedRunningTime="2026-01-27 11:36:24.976702854 +0000 UTC m=+964.118300631"
Jan 27 11:36:24 crc kubenswrapper[4775]: I0127 11:36:24.985667 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t2tfh" podStartSLOduration=3.542734202 podStartE2EDuration="5.985651168s" podCreationTimestamp="2026-01-27 11:36:19 +0000 UTC" firstStartedPulling="2026-01-27 11:36:20.87328792 +0000 UTC m=+960.014885697" lastFinishedPulling="2026-01-27 11:36:23.316204886 +0000 UTC m=+962.457802663" observedRunningTime="2026-01-27 11:36:24.985186706 +0000 UTC m=+964.126784483" watchObservedRunningTime="2026-01-27 11:36:24.985651168 +0000 UTC m=+964.127248945"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.004547 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-m5645" podStartSLOduration=3.004529794 podStartE2EDuration="3.004529794s" podCreationTimestamp="2026-01-27 11:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:25.002827758 +0000 UTC m=+964.144425545" watchObservedRunningTime="2026-01-27 11:36:25.004529794 +0000 UTC m=+964.146127571"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062858 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.062967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.063009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.063079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.064132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.067741 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.067922 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.068317 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.085022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"dnsmasq-dns-6cb545bd4c-xrw7x\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.132013 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.313251 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.368917 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") pod \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") "
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.369347 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") pod \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\" (UID: \"f53ed1d7-9aa1-49d4-8396-c3487e0465d6\") "
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.370209 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f53ed1d7-9aa1-49d4-8396-c3487e0465d6" (UID: "f53ed1d7-9aa1-49d4-8396-c3487e0465d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.373990 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk" (OuterVolumeSpecName: "kube-api-access-cgcjk") pod "f53ed1d7-9aa1-49d4-8396-c3487e0465d6" (UID: "f53ed1d7-9aa1-49d4-8396-c3487e0465d6"). InnerVolumeSpecName "kube-api-access-cgcjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.471331 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgcjk\" (UniqueName: \"kubernetes.io/projected/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-kube-api-access-cgcjk\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.471358 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f53ed1d7-9aa1-49d4-8396-c3487e0465d6-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:36:25 crc kubenswrapper[4775]: W0127 11:36:25.685144 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc24ee1fa_0d6a_4ca1_b298_d876f473f8f8.slice/crio-28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094 WatchSource:0}: Error finding container 28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094: Status 404 returned error can't find the container with id 28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.686659 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"]
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.781244 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"]
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.781504 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hp57" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server" containerID="cri-o://c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4" gracePeriod=2
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.960837 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-z98pk"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.960864 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-z98pk" event={"ID":"f53ed1d7-9aa1-49d4-8396-c3487e0465d6","Type":"ContainerDied","Data":"8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad"}
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.961493 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e088adb7960ae079bd4ae17a34932aaa8b20d59a3740ef813f88d0598ace2ad"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.966616 4775 generic.go:334] "Generic (PLEG): container finished" podID="de956838-03d3-41d8-96d3-a85293eff207" containerID="be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c" exitCode=0
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.966687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c"}
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.969700 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerStarted","Data":"28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094"}
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.973459 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Jan 27 11:36:25 crc kubenswrapper[4775]: E0127 11:36:25.973899 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerName="mariadb-database-create"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.973924 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerName="mariadb-database-create"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.974124 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" containerName="mariadb-database-create"
Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.984802 4775 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/swift-storage-0" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.987124 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.987204 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-xl5vv" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.989021 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.990930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Jan 27 11:36:25 crc kubenswrapper[4775]: I0127 11:36:25.998509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081802 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081860 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.081886 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082087 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082222 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.082297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184259 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184329 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184384 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184413 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184799 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184829 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.184888 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:26.684868769 +0000 UTC m=+965.826466536 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184941 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.184966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-cache\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.185240 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-lock\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.191313 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.202421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr99d\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-kube-api-access-rr99d\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.211146 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.348833 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.106:5353: connect: connection refused" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.505187 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.508601 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514212 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514405 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.514593 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.518045 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"] Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.593677 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.593947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594013 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594136 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594297 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.594316 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 
11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.677121 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695724 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695788 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695833 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695942 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.695965 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.696029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.696668 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " 
pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696780 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696801 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: E0127 11:36:26.696841 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:27.69682602 +0000 UTC m=+966.838423797 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.697777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.698065 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.699890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.702226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.703801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.721286 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"swift-ring-rebalance-7bdl6\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.832957 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.981952 4775 generic.go:334] "Generic (PLEG): container finished" podID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerID="c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4" exitCode=0 Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.982033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4"} Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.987068 4775 generic.go:334] "Generic (PLEG): container finished" podID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerID="3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985" exitCode=0 Jan 27 11:36:26 crc kubenswrapper[4775]: I0127 11:36:26.987099 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerDied","Data":"3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985"} Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.143223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.144046 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.208011 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.348116 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7bdl6"] Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.491537 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611531 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.611839 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") pod \"de956838-03d3-41d8-96d3-a85293eff207\" (UID: \"de956838-03d3-41d8-96d3-a85293eff207\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.616795 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9" (OuterVolumeSpecName: "kube-api-access-ztkr9") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "kube-api-access-ztkr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.655392 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config" (OuterVolumeSpecName: "config") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.657802 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.666128 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "de956838-03d3-41d8-96d3-a85293eff207" (UID: "de956838-03d3-41d8-96d3-a85293eff207"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.669891 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.712910 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713154 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") pod \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\" (UID: \"f7ac68bf-cd99-4022-af50-a73ddc6181b0\") " Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713337 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713527 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713539 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713548 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztkr9\" (UniqueName: \"kubernetes.io/projected/de956838-03d3-41d8-96d3-a85293eff207-kube-api-access-ztkr9\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713558 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/de956838-03d3-41d8-96d3-a85293eff207-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.713573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities" (OuterVolumeSpecName: "utilities") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713648 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713658 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.713692 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:29.713680721 +0000 UTC m=+968.855278498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.734353 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm" (OuterVolumeSpecName: "kube-api-access-9jjzm") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "kube-api-access-9jjzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.761612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7ac68bf-cd99-4022-af50-a73ddc6181b0" (UID: "f7ac68bf-cd99-4022-af50-a73ddc6181b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808139 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-599fs"] Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808550 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-utilities" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808573 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-utilities" Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808593 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-content" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808602 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="extract-content" Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808625 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808634 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808654 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="init" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808663 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="init" Jan 27 11:36:27 crc kubenswrapper[4775]: E0127 11:36:27.808679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808687 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808914 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="de956838-03d3-41d8-96d3-a85293eff207" containerName="dnsmasq-dns" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.808927 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" containerName="registry-server" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.809934 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815264 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815287 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7ac68bf-cd99-4022-af50-a73ddc6181b0-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.815300 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jjzm\" (UniqueName: \"kubernetes.io/projected/f7ac68bf-cd99-4022-af50-a73ddc6181b0-kube-api-access-9jjzm\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.817281 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-599fs"] Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.916594 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.916734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.931817 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"] Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.933074 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.935199 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 27 11:36:27 crc kubenswrapper[4775]: I0127 11:36:27.959537 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" event={"ID":"de956838-03d3-41d8-96d3-a85293eff207","Type":"ContainerDied","Data":"fd5b11d9815172e4b8d84472d53d5f5c5a67656114e7d95c656d684b7f601224"} Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001922 4775 scope.go:117] "RemoveContainer" containerID="be03cf8dd9b8b4d759b45f66de69e36302203158f35e88be1b7e33246324a38c" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.001929 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-794868bd45-kgvb6" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.003862 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerStarted","Data":"40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8"} Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.010785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hp57" event={"ID":"f7ac68bf-cd99-4022-af50-a73ddc6181b0","Type":"ContainerDied","Data":"d68aa08b8c10efd267dbb532a84a73914540135473560968b1351b3eea784ca0"} Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.010809 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2hp57" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019156 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019233 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.019301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.021522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.022107 4775 generic.go:334] "Generic (PLEG): container finished" podID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerID="b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b" exitCode=0 Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.022469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b"} Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 
11:36:28.040664 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"glance-db-create-599fs\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " pod="openstack/glance-db-create-599fs" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.057165 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.078300 4775 scope.go:117] "RemoveContainer" containerID="2e81b0c9e712aea129261264566fcfb84e7c96b384aef515c9ba092ed6df8a8f" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.078442 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-794868bd45-kgvb6"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.094872 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.106584 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.117148 4775 scope.go:117] "RemoveContainer" containerID="c77bc3df0ef278fa6252111f5fae5b862f83e4845c5181bed73e9f84cf00a7b4" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.117580 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hp57"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.121480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.121573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.122577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.125881 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.143193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"glance-9763-account-create-update-dms9b\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.184706 4775 scope.go:117] "RemoveContainer" containerID="862553ece09ec7abc1ec1a84f1cafbd9dd0b4ae450db1c4c095ba98bfbf00ead" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.212596 4775 scope.go:117] "RemoveContainer" containerID="a7078e3f79ebaf3e143c08c9b5d9ba3454399bc72621c179ec98e87d8ca953ac" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.256676 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.490287 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-m5645" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.527552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") pod \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.527818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") pod \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\" (UID: \"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d\") " Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.529444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" (UID: "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.539530 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh" (OuterVolumeSpecName: "kube-api-access-5rfmh") pod "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" (UID: "62f5bc59-5fa8-42f4-bc7b-85827a01cc9d"). InnerVolumeSpecName "kube-api-access-5rfmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.629839 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.629884 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rfmh\" (UniqueName: \"kubernetes.io/projected/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d-kube-api-access-5rfmh\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.649147 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-599fs"] Jan 27 11:36:28 crc kubenswrapper[4775]: I0127 11:36:28.777664 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"] Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.032622 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerStarted","Data":"5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.033707 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.036275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerStarted","Data":"b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.036309 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerStarted","Data":"3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.039605 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerStarted","Data":"1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.039637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerStarted","Data":"01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.041679 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-m5645" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.043067 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-m5645" event={"ID":"62f5bc59-5fa8-42f4-bc7b-85827a01cc9d","Type":"ContainerDied","Data":"5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f"} Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.043179 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5923ba32098b5e082d1f4b2d5b1afb7d403212251b636753e0cd847905ffc64f" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.059749 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podStartSLOduration=5.059729011 podStartE2EDuration="5.059729011s" podCreationTimestamp="2026-01-27 11:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.054016276 +0000 UTC m=+968.195614083" watchObservedRunningTime="2026-01-27 11:36:29.059729011 +0000 UTC m=+968.201326788" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.080026 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-599fs" podStartSLOduration=2.080009076 podStartE2EDuration="2.080009076s" podCreationTimestamp="2026-01-27 11:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.069614441 +0000 UTC m=+968.211212218" watchObservedRunningTime="2026-01-27 11:36:29.080009076 +0000 UTC m=+968.221606853" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.088869 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-9763-account-create-update-dms9b" podStartSLOduration=2.088851587 podStartE2EDuration="2.088851587s" podCreationTimestamp="2026-01-27 11:36:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:29.086552634 +0000 UTC m=+968.228150411" watchObservedRunningTime="2026-01-27 11:36:29.088851587 +0000 UTC m=+968.230449354" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521080 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521191 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.521249 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.522214 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"} 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.522292 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310" gracePeriod=600 Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.659359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jz4kw"] Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.669883 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jz4kw"] Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.728123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.728255 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.747301 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747498 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747512 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.747550 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:33.747537465 +0000 UTC m=+972.889135242 (durationBeforeRetry 4s). 
Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.786650 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba97a22f-b5dd-4289-bb3b-39578c05f231" path="/var/lib/kubelet/pods/ba97a22f-b5dd-4289-bb3b-39578c05f231/volumes" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.787160 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de956838-03d3-41d8-96d3-a85293eff207" path="/var/lib/kubelet/pods/de956838-03d3-41d8-96d3-a85293eff207/volumes" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.787685 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7ac68bf-cd99-4022-af50-a73ddc6181b0" path="/var/lib/kubelet/pods/f7ac68bf-cd99-4022-af50-a73ddc6181b0/volumes" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.788849 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vvxg4"] Jan 27 11:36:29 crc kubenswrapper[4775]: E0127 11:36:29.789111 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789126 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789288 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" containerName="mariadb-database-create" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789783 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxg4"] Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.789860 4775 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.791318 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.849011 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.849202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.951212 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.951417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.952200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:29 crc kubenswrapper[4775]: I0127 11:36:29.974530 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"root-account-create-update-vvxg4\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.114023 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.199546 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:30 crc kubenswrapper[4775]: I0127 11:36:30.796880 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t2tfh" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" probeResult="failure" output=< Jan 27 11:36:30 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:36:30 crc kubenswrapper[4775]: > Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.064302 4775 generic.go:334] "Generic (PLEG): container finished" podID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerID="b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c" exitCode=0 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.064495 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerDied","Data":"b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c"} Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071103 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310" exitCode=0 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071311 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310"} Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.071375 4775 scope.go:117] "RemoveContainer" containerID="2871a1c3582de4c70e2186866f517a9085c1741422622dc5d1e02969b09f93ad" Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.074409 4775 generic.go:334] "Generic (PLEG): container finished" podID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerID="25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e" exitCode=0 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.074609 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerDied","Data":"25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e"} Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.075121 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8s5p8" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server" containerID="cri-o://980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" gracePeriod=2 Jan 27 11:36:31 crc kubenswrapper[4775]: I0127 11:36:31.806922 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.114964 4775 generic.go:334] "Generic (PLEG): container finished" podID="a6b48db0-1768-4940-9e42-0362374c7358" containerID="980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.115166 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" 
event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.117830 4775 generic.go:334] "Generic (PLEG): container finished" podID="f577e755-a863-4fea-9288-6cd30168b405" containerID="1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.117876 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerDied","Data":"1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.120499 4775 generic.go:334] "Generic (PLEG): container finished" podID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerID="a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9" exitCode=0 Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.120678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerDied","Data":"a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9"} Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.185886 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291749 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.291801 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") pod \"a6b48db0-1768-4940-9e42-0362374c7358\" (UID: \"a6b48db0-1768-4940-9e42-0362374c7358\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.292783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities" (OuterVolumeSpecName: "utilities") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.299221 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q" (OuterVolumeSpecName: "kube-api-access-d7c9q") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "kube-api-access-d7c9q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.393673 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.393703 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7c9q\" (UniqueName: \"kubernetes.io/projected/a6b48db0-1768-4940-9e42-0362374c7358-kube-api-access-d7c9q\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.403128 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vvxg4"] Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.552980 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.581591 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.585151 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6b48db0-1768-4940-9e42-0362374c7358" (UID: "a6b48db0-1768-4940-9e42-0362374c7358"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") pod \"0bbde61d-aca8-4b36-8896-9c0db3e081be\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602391 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") pod \"0bbde61d-aca8-4b36-8896-9c0db3e081be\" (UID: \"0bbde61d-aca8-4b36-8896-9c0db3e081be\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.602867 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6b48db0-1768-4940-9e42-0362374c7358-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.604247 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bbde61d-aca8-4b36-8896-9c0db3e081be" (UID: "0bbde61d-aca8-4b36-8896-9c0db3e081be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.608696 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx" (OuterVolumeSpecName: "kube-api-access-jrcgx") pod "0bbde61d-aca8-4b36-8896-9c0db3e081be" (UID: "0bbde61d-aca8-4b36-8896-9c0db3e081be"). InnerVolumeSpecName "kube-api-access-jrcgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.704498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") pod \"c3cd1d9e-b735-4f90-b92a-00353e576e10\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.704558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") pod \"c3cd1d9e-b735-4f90-b92a-00353e576e10\" (UID: \"c3cd1d9e-b735-4f90-b92a-00353e576e10\") " Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705002 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c3cd1d9e-b735-4f90-b92a-00353e576e10" (UID: "c3cd1d9e-b735-4f90-b92a-00353e576e10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705141 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bbde61d-aca8-4b36-8896-9c0db3e081be-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705159 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/0bbde61d-aca8-4b36-8896-9c0db3e081be-kube-api-access-jrcgx\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.705171 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c3cd1d9e-b735-4f90-b92a-00353e576e10-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.707051 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh" (OuterVolumeSpecName: "kube-api-access-wlxfh") pod "c3cd1d9e-b735-4f90-b92a-00353e576e10" (UID: "c3cd1d9e-b735-4f90-b92a-00353e576e10"). InnerVolumeSpecName "kube-api-access-wlxfh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:32 crc kubenswrapper[4775]: I0127 11:36:32.807390 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlxfh\" (UniqueName: \"kubernetes.io/projected/c3cd1d9e-b735-4f90-b92a-00353e576e10-kube-api-access-wlxfh\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.128765 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130163 4775 generic.go:334] "Generic (PLEG): container finished" podID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerID="f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886" exitCode=0 Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130223 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerDied","Data":"f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.130249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerStarted","Data":"24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s5p8" event={"ID":"a6b48db0-1768-4940-9e42-0362374c7358","Type":"ContainerDied","Data":"f2f98601e1fc4d2ec97f9e0c70c2dbd57bb16d6fdfa6d9ac4a20a475acb21242"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132223 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s5p8" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.132234 4775 scope.go:117] "RemoveContainer" containerID="980f0264e345cdfb3b6f590b6db854bc469b4b363ed97dbbe2f3f3ceba904a42" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136721 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-2856-account-create-update-zgmqw" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136694 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-2856-account-create-update-zgmqw" event={"ID":"0bbde61d-aca8-4b36-8896-9c0db3e081be","Type":"ContainerDied","Data":"2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.136910 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4f48c0c35388742479c887fb4079df26a0d51b18d1878461a20933bd575635" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146231 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-599fs" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-599fs" event={"ID":"c3cd1d9e-b735-4f90-b92a-00353e576e10","Type":"ContainerDied","Data":"3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.146326 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b5ddd612ab93297e3e23fb56033a168ed9825de73d3fd9e685554b0fa0f4c04" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.148519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerStarted","Data":"510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261"} Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.159830 4775 scope.go:117] "RemoveContainer" containerID="1f7c49ce837d6dbb266165ca63e898a0dc5b0872cf3564463905319d62ce7b1b" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.192024 4775 scope.go:117] "RemoveContainer" containerID="dfcc40044b419ee03e79042d3f7fccf98f28c41e9f9431de67dcc1968ec91051" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.196427 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7bdl6" podStartSLOduration=2.597667803 podStartE2EDuration="7.192360134s" podCreationTimestamp="2026-01-27 11:36:26 +0000 UTC" firstStartedPulling="2026-01-27 11:36:27.366364256 +0000 UTC m=+966.507962033" lastFinishedPulling="2026-01-27 11:36:31.961056587 +0000 UTC m=+971.102654364" observedRunningTime="2026-01-27 11:36:33.186619347 +0000 UTC m=+972.328217124" watchObservedRunningTime="2026-01-27 11:36:33.192360134 +0000 UTC m=+972.333957911" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.214498 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.228722 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8s5p8"] Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.535847 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.543639 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.625993 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") pod \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") pod \"f577e755-a863-4fea-9288-6cd30168b405\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626056 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") pod \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\" (UID: \"24d04bb6-3007-42c5-9753-746a6eeb7d1c\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626075 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") pod \"f577e755-a863-4fea-9288-6cd30168b405\" (UID: \"f577e755-a863-4fea-9288-6cd30168b405\") " Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626511 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "24d04bb6-3007-42c5-9753-746a6eeb7d1c" (UID: "24d04bb6-3007-42c5-9753-746a6eeb7d1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.626658 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f577e755-a863-4fea-9288-6cd30168b405" (UID: "f577e755-a863-4fea-9288-6cd30168b405"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.635163 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8" (OuterVolumeSpecName: "kube-api-access-bccp8") pod "f577e755-a863-4fea-9288-6cd30168b405" (UID: "f577e755-a863-4fea-9288-6cd30168b405"). InnerVolumeSpecName "kube-api-access-bccp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.635360 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc" (OuterVolumeSpecName: "kube-api-access-66pxc") pod "24d04bb6-3007-42c5-9753-746a6eeb7d1c" (UID: "24d04bb6-3007-42c5-9753-746a6eeb7d1c"). InnerVolumeSpecName "kube-api-access-66pxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728228 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/24d04bb6-3007-42c5-9753-746a6eeb7d1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728263 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f577e755-a863-4fea-9288-6cd30168b405-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728273 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66pxc\" (UniqueName: \"kubernetes.io/projected/24d04bb6-3007-42c5-9753-746a6eeb7d1c-kube-api-access-66pxc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.728283 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bccp8\" (UniqueName: \"kubernetes.io/projected/f577e755-a863-4fea-9288-6cd30168b405-kube-api-access-bccp8\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.755213 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b48db0-1768-4940-9e42-0362374c7358" path="/var/lib/kubelet/pods/a6b48db0-1768-4940-9e42-0362374c7358/volumes" Jan 27 11:36:33 crc kubenswrapper[4775]: I0127 11:36:33.829396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 11:36:33.829645 4775 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 11:36:33.829681 4775 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Jan 27 11:36:33 crc kubenswrapper[4775]: E0127 11:36:33.829758 4775 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift podName:b2f2b115-8dea-4dfa-a28e-5322f8fb8274 nodeName:}" failed. No retries permitted until 2026-01-27 11:36:41.8297336 +0000 UTC m=+980.971331377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift") pod "swift-storage-0" (UID: "b2f2b115-8dea-4dfa-a28e-5322f8fb8274") : configmap "swift-ring-files" not found Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.158858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9763-account-create-update-dms9b" event={"ID":"f577e755-a863-4fea-9288-6cd30168b405","Type":"ContainerDied","Data":"01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327"} Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.158917 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01a9061ed3fa1746263b0d1d14017828bc7e0337d318aa6d508766ae75ad8327" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.159071 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9763-account-create-update-dms9b" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8d1f-account-create-update-gbh56" event={"ID":"24d04bb6-3007-42c5-9753-746a6eeb7d1c","Type":"ContainerDied","Data":"df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622"} Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162264 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df80ece7ab0fd17c0d9c7e70ac47be4aea20f8011f1d38b81535074ba3cc4622" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.162312 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8d1f-account-create-update-gbh56" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.525819 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.641070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") pod \"f208e1de-fc0e-4deb-a093-d27604b3931f\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.641354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") pod \"f208e1de-fc0e-4deb-a093-d27604b3931f\" (UID: \"f208e1de-fc0e-4deb-a093-d27604b3931f\") " Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.642197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f208e1de-fc0e-4deb-a093-d27604b3931f" (UID: "f208e1de-fc0e-4deb-a093-d27604b3931f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.646166 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq" (OuterVolumeSpecName: "kube-api-access-cmbjq") pod "f208e1de-fc0e-4deb-a093-d27604b3931f" (UID: "f208e1de-fc0e-4deb-a093-d27604b3931f"). InnerVolumeSpecName "kube-api-access-cmbjq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.745129 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f208e1de-fc0e-4deb-a093-d27604b3931f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:34 crc kubenswrapper[4775]: I0127 11:36:34.746129 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmbjq\" (UniqueName: \"kubernetes.io/projected/f208e1de-fc0e-4deb-a093-d27604b3931f-kube-api-access-cmbjq\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.133802 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.176801 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vvxg4" event={"ID":"f208e1de-fc0e-4deb-a093-d27604b3931f","Type":"ContainerDied","Data":"24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433"} Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.176904 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24da8498cc84cff4f9ef441ab42afb304d113914ccb69880de7715e226fe3433" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.176834 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vvxg4" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.201520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.201810 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns" containerID="cri-o://8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" gracePeriod=10 Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.765084 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870259 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870385 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870584 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.870687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") pod \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\" (UID: \"31d3ee22-9b3b-46ac-b896-ba5c521e1753\") " Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.903252 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x" (OuterVolumeSpecName: "kube-api-access-m5q2x") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "kube-api-access-m5q2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.911491 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config" (OuterVolumeSpecName: "config") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.913981 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.922672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.923745 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31d3ee22-9b3b-46ac-b896-ba5c521e1753" (UID: "31d3ee22-9b3b-46ac-b896-ba5c521e1753"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974416 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974444 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974472 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974481 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31d3ee22-9b3b-46ac-b896-ba5c521e1753-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:35 crc kubenswrapper[4775]: I0127 11:36:35.974489 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5q2x\" (UniqueName: \"kubernetes.io/projected/31d3ee22-9b3b-46ac-b896-ba5c521e1753-kube-api-access-m5q2x\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185538 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"} Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.185623 4775 scope.go:117] "RemoveContainer" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.187628 4775 generic.go:334] "Generic (PLEG): container finished" podID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" exitCode=0 Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.187742 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757dc6fff9-t2sfn" event={"ID":"31d3ee22-9b3b-46ac-b896-ba5c521e1753","Type":"ContainerDied","Data":"b21b311b130da3440a7a2e7074ea7f07554bbb6a824125778029cf4c67436a28"} Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.208020 4775 scope.go:117] "RemoveContainer" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.232012 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.237731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757dc6fff9-t2sfn"] Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.239050 4775 scope.go:117] "RemoveContainer" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" Jan 27 11:36:36 crc kubenswrapper[4775]: E0127 11:36:36.240532 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": container with ID starting with 8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3 not found: ID does not exist" containerID="8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240564 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3"} err="failed to get container status \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": rpc error: code = NotFound desc = could not find container \"8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3\": container with ID starting with 8e92b2c7df75d712ddac5f2a343941209fc8834192a358ee127166d365a32fa3 not found: ID does not exist" Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240588 4775 scope.go:117] "RemoveContainer" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9" Jan 27 11:36:36 crc kubenswrapper[4775]: E0127 11:36:36.240938 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": container with ID starting with eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9 not found: ID does not exist" containerID="eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9" Jan 27 11:36:36 crc 
Jan 27 11:36:36 crc kubenswrapper[4775]: I0127 11:36:36.240958 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9"} err="failed to get container status \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": rpc error: code = NotFound desc = could not find container \"eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9\": container with ID starting with eaa16eda28f658167c39eafe60736c1f0250fd23d02e0edc3baf80c23ceb32c9 not found: ID does not exist" Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.201945 4775 generic.go:334] "Generic (PLEG): container finished" podID="01ba029b-2296-4519-b6b1-04674355258f" containerID="74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca" exitCode=0 Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.202000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca"} Jan 27 11:36:37 crc kubenswrapper[4775]: I0127 11:36:37.754738 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" path="/var/lib/kubelet/pods/31d3ee22-9b3b-46ac-b896-ba5c521e1753/volumes" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.088980 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089309 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089326 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089338 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="init" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089346 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="init" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089362 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089371 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089391 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-content" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089402 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-content" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089412 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089417 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create" Jan 27 11:36:38 crc
kubenswrapper[4775]: E0127 11:36:38.089429 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089435 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089444 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089473 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089486 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-utilities" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089494 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="extract-utilities" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089508 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089514 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: E0127 11:36:38.089523 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089529 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089696 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089713 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d3ee22-9b3b-46ac-b896-ba5c521e1753" containerName="dnsmasq-dns" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089727 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089737 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6b48db0-1768-4940-9e42-0362374c7358" containerName="registry-server" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089748 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089765 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" containerName="mariadb-database-create" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.089776 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f577e755-a863-4fea-9288-6cd30168b405" containerName="mariadb-account-create-update" Jan 27 11:36:38 crc 
kubenswrapper[4775]: I0127 11:36:38.090416 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.092057 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ghp7c" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.092258 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.103988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107118 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.107245 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209592 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.209633 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.214785 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.216323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerStarted","Data":"0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1"} Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.216660 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.217974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.235084 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.251704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"glance-db-sync-sd44h\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.406340 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sd44h" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.970235 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.787559383 podStartE2EDuration="1m0.97020816s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:40.159252431 +0000 UTC m=+919.300850198" lastFinishedPulling="2026-01-27 11:36:00.341901198 +0000 UTC m=+939.483498975" observedRunningTime="2026-01-27 11:36:38.248285043 +0000 UTC m=+977.389882830" watchObservedRunningTime="2026-01-27 11:36:38.97020816 +0000 UTC m=+978.111805957" Jan 27 11:36:38 crc kubenswrapper[4775]: I0127 11:36:38.976954 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.222068 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerStarted","Data":"6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6"} Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.223202 4775 generic.go:334] "Generic (PLEG): container finished" podID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerID="510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261" exitCode=0 Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.223478 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerDied","Data":"510f5ff2f8d44620fdee51bdb0166c2c4b4f86e61d274047b5401fdf6da98261"} Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.773414 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:39 crc kubenswrapper[4775]: I0127 11:36:39.825905 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.008948 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"] Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.548681 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671532 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671696 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671726 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.671893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") pod \"aa44a018-6958-4bee-895d-e7ec3966be8d\" (UID: \"aa44a018-6958-4bee-895d-e7ec3966be8d\") " Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.672918 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.673558 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.677404 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx" (OuterVolumeSpecName: "kube-api-access-th5jx") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "kube-api-access-th5jx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.693200 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts" (OuterVolumeSpecName: "scripts") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.693967 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.696196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.708643 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "aa44a018-6958-4bee-895d-e7ec3966be8d" (UID: "aa44a018-6958-4bee-895d-e7ec3966be8d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774347 4775 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774404 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th5jx\" (UniqueName: \"kubernetes.io/projected/aa44a018-6958-4bee-895d-e7ec3966be8d-kube-api-access-th5jx\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774487 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774502 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774514 4775 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aa44a018-6958-4bee-895d-e7ec3966be8d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774525 4775 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aa44a018-6958-4bee-895d-e7ec3966be8d-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:40 crc kubenswrapper[4775]: I0127 11:36:40.774535 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aa44a018-6958-4bee-895d-e7ec3966be8d-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.249679 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t2tfh" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" containerID="cri-o://8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" gracePeriod=2 Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7bdl6" event={"ID":"aa44a018-6958-4bee-895d-e7ec3966be8d","Type":"ContainerDied","Data":"40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8"} Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250093 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d78acc3513c42656eeabd0301aca54c4b90d9da6dc67b6891b3be0547d67c8" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.250300 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7bdl6" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.876629 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.892409 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.899248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b2f2b115-8dea-4dfa-a28e-5322f8fb8274-etc-swift\") pod \"swift-storage-0\" (UID: \"b2f2b115-8dea-4dfa-a28e-5322f8fb8274\") " pod="openstack/swift-storage-0" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.901352 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994249 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994432 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.994476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") pod \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\" (UID: \"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0\") " Jan 27 11:36:41 crc kubenswrapper[4775]: I0127 11:36:41.995955 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities" (OuterVolumeSpecName: "utilities") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.001218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj" (OuterVolumeSpecName: "kube-api-access-5cjdj") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "kube-api-access-5cjdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.096987 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.097021 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjdj\" (UniqueName: \"kubernetes.io/projected/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-kube-api-access-5cjdj\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.147042 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" (UID: "6330ccb9-6a5a-42d6-8c0f-b3c395b867a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.198080 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262729 4775 generic.go:334] "Generic (PLEG): container finished" podID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" exitCode=0 Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"} Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262822 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t2tfh" event={"ID":"6330ccb9-6a5a-42d6-8c0f-b3c395b867a0","Type":"ContainerDied","Data":"99f3348700fb94d370f10d24c119a738256680dc0ee1f38c4d297c9772b690ab"} Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.262862 4775 scope.go:117] "RemoveContainer" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.263179 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t2tfh" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.285980 4775 scope.go:117] "RemoveContainer" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.308795 4775 scope.go:117] "RemoveContainer" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.340716 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"] Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.340912 4775 scope.go:117] "RemoveContainer" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.341367 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": container with ID starting with 8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67 not found: ID does not exist" containerID="8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341406 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67"} err="failed to get container status \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": rpc error: code = NotFound desc = could not find container \"8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67\": container with ID starting with 8e8c88c2c2f8bcca8587657ea73a67247118760c9f9fceab2d9ae7f4ffdd4d67 not found: ID does not exist" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341430 4775 scope.go:117] "RemoveContainer" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.341840 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": container with ID starting with 96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e not found: ID does not exist" containerID="96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341879 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e"} err="failed to get container status \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": rpc error: code = NotFound desc = could not find container \"96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e\": container with ID starting with 96cd00078532cf76f20c06d6ac961f37b77b07f7db1ad381c802bb2009efbc6e not found: ID does not exist" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.341899 4775 scope.go:117] "RemoveContainer" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" Jan 27 11:36:42 crc kubenswrapper[4775]: E0127 11:36:42.342245 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": container with ID starting with 
af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a not found: ID does not exist" containerID="af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.342305 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a"} err="failed to get container status \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": rpc error: code = NotFound desc = could not find container \"af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a\": container with ID starting with af5f47067c543e9f42636d69e4b7f2e31a80ca9689751b39215f8fa2a5781f7a not found: ID does not exist" Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.347557 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t2tfh"] Jan 27 11:36:42 crc kubenswrapper[4775]: I0127 11:36:42.482359 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Jan 27 11:36:42 crc kubenswrapper[4775]: W0127 11:36:42.486224 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2f2b115_8dea_4dfa_a28e_5322f8fb8274.slice/crio-34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b WatchSource:0}: Error finding container 34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b: Status 404 returned error can't find the container with id 34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.277651 4775 generic.go:334] "Generic (PLEG): container finished" podID="83263987-4e3c-4e95-9083-bb6a43f52410" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55" exitCode=0 Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.277721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"} Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.279662 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"34447a8520ad634dab42c94126b8d402f1fbd135af073a6fcdb5fc226f4e114b"} Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.756423 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" path="/var/lib/kubelet/pods/6330ccb9-6a5a-42d6-8c0f-b3c395b867a0/volumes" Jan 27 11:36:43 crc kubenswrapper[4775]: I0127 11:36:43.964600 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:43 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:43 crc kubenswrapper[4775]: > Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.289121 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerStarted","Data":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 
11:36:44.289407 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.296522 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"2190ae12109e8e1dceb559f827413fd62ef1ea37bbce5e271b7ce01d48316f0c"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.296577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"53e60efb3d4da4f9c33a16ff79d7060d850dcc3d7dc90d35deb2f114cc11efec"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.296590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"929dca143c02dd69cb1cad1d202c4addf9831873b6d8f82600a6d97b8e48ecc2"} Jan 27 11:36:44 crc kubenswrapper[4775]: I0127 11:36:44.313460 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371970.54135 podStartE2EDuration="1m6.31342735s" podCreationTimestamp="2026-01-27 11:35:38 +0000 UTC" firstStartedPulling="2026-01-27 11:35:40.42103229 +0000 UTC m=+919.562630057" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:44.309750619 +0000 UTC m=+983.451348396" watchObservedRunningTime="2026-01-27 11:36:44.31342735 +0000 UTC m=+983.455025127" Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.949153 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:48 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:48 crc kubenswrapper[4775]: > Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.957700 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:48 crc kubenswrapper[4775]: I0127 11:36:48.961323 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-l9blz" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230046 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230874 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-utilities" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230893 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-utilities" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230913 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-content" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230919 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="extract-content" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230931 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230938 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: E0127 11:36:49.230950 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.230956 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.231127 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6330ccb9-6a5a-42d6-8c0f-b3c395b867a0" containerName="registry-server" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.231139 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa44a018-6958-4bee-895d-e7ec3966be8d" containerName="swift-ring-rebalance" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.234027 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.235974 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.243294 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.328949 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329260 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329477 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: 
\"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.329558 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430170 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430672 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430728 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.430747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.431347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.432792 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.451297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"ovn-controller-4hqln-config-zx9sc\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.549850 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.566708 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.921269 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.922923 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:49 crc kubenswrapper[4775]: I0127 11:36:49.957016 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.013809 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.014824 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.031239 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.032270 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.034851 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.044395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.044469 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.061961 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.080153 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.120412 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.122329 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.124488 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146652 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146736 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " 
pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.146941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.147213 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.148023 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.176209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"cinder-db-create-x8mb5\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.232881 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.237628 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.245956 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246387 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.246517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248514 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248564 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248601 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248667 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.248733 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.249807 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: 
\"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.250282 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.252545 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.264065 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.307407 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"cinder-a920-account-create-update-7gdg6\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.314295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"barbican-db-create-62xpg\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.336034 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.350721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.350928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351176 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.351223 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.352640 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.360813 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.369086 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.370278 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.380004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.399163 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"barbican-b4bd-account-create-update-lztz8\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.439828 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.449073 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452560 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.452923 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.455862 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.456683 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.456878 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.458629 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.474132 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"keystone-db-sync-kc6bw\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") " pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.554861 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555290 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.555997 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " 
pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.570187 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"neutron-db-create-fcvx2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.656658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.656713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.657577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.674300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"neutron-b21b-account-create-update-grvbp\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.702248 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kc6bw" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.730257 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:50 crc kubenswrapper[4775]: I0127 11:36:50.836014 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.829888 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.831353 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sgdvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-sd44h_openstack(ca5aab7c-3b7a-4996-82f5-478d4100bb6c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:36:53 crc kubenswrapper[4775]: E0127 11:36:53.832769 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-sd44h" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" Jan 27 11:36:53 crc kubenswrapper[4775]: I0127 11:36:53.967792 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-4hqln" podUID="cacc7142-a8d4-4607-adb7-0090fbd3024a" containerName="ovn-controller" probeResult="failure" output=< Jan 27 11:36:53 crc kubenswrapper[4775]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 27 11:36:53 crc 
kubenswrapper[4775]: > Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.190913 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7d7f9ca_2e9c_4379_bee2_38cf61ed6cb2.slice/crio-f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827 WatchSource:0}: Error finding container f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827: Status 404 returned error can't find the container with id f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827 Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.191687 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.370151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerStarted","Data":"ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004"} Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.370191 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerStarted","Data":"f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827"} Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.374423 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"35b31caed1f9b488b656fc0047e0668077c6f405a3d048c168e07f94d8f89241"} Jan 27 11:36:54 crc kubenswrapper[4775]: E0127 11:36:54.377973 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:e4aa4ebbb1e581a12040e9ad2ae2709ac31b5d965bb64fc4252d1028b05c565f\\\"\"" pod="openstack/glance-db-sync-sd44h" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.400258 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-fcvx2" podStartSLOduration=4.400242807 podStartE2EDuration="4.400242807s" podCreationTimestamp="2026-01-27 11:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:36:54.398917471 +0000 UTC m=+993.540515248" watchObservedRunningTime="2026-01-27 11:36:54.400242807 +0000 UTC m=+993.541840584" Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.560405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.572132 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.582962 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58a046ea_e8eb_40ed_a64d_b382e0a2f331.slice/crio-645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31 WatchSource:0}: Error finding container 645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31: Status 404 returned error can't find the container with id 645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31 Jan 27 
11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.590432 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066d45f9_5f72_4b81_8166_0238863b8789.slice/crio-5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a WatchSource:0}: Error finding container 5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a: Status 404 returned error can't find the container with id 5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.594387 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.605511 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.614006 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.624575 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:36:54 crc kubenswrapper[4775]: I0127 11:36:54.633918 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:36:54 crc kubenswrapper[4775]: W0127 11:36:54.634965 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc495d390_f7ca_4867_b334_263c03f6b211.slice/crio-2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669 WatchSource:0}: Error finding container 2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669: Status 404 returned error can't find the container with id 2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669 Jan 27 11:36:55 crc kubenswrapper[4775]: E0127 11:36:55.147049 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc495d390_f7ca_4867_b334_263c03f6b211.slice/crio-60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385255 4775 generic.go:334] "Generic (PLEG): container finished" podID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerID="8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385343 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerDied","Data":"8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.385702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerStarted","Data":"645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388675 4775 generic.go:334] "Generic (PLEG): container finished" podID="c495d390-f7ca-4867-b334-263c03f6b211" containerID="60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388809 4775 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerDied","Data":"60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.388831 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerStarted","Data":"2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.391701 4775 generic.go:334] "Generic (PLEG): container finished" podID="90455f95-bcc6-4229-948c-599c91a08b2a" containerID="ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.391965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerDied","Data":"ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.392039 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerStarted","Data":"2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.394090 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerStarted","Data":"11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.395412 4775 generic.go:334] "Generic (PLEG): container finished" podID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerID="ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.395509 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerDied","Data":"ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398146 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerID="17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398224 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerDied","Data":"17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.398247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerStarted","Data":"0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408023 4775 generic.go:334] "Generic (PLEG): container finished" podID="066d45f9-5f72-4b81-8166-0238863b8789" containerID="7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408163 4775 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerDied","Data":"7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.408214 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerStarted","Data":"5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411617 4775 generic.go:334] "Generic (PLEG): container finished" podID="14af2799-fccb-4f03-99f2-356e53df0f68" containerID="ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec" exitCode=0 Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerDied","Data":"ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec"} Jan 27 11:36:55 crc kubenswrapper[4775]: I0127 11:36:55.411689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerStarted","Data":"569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.426691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"98ff6ddda4279972141347d396b3579ff9da021538016933851421012d167dfd"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"88474d892484c7924d01a47d7c326f452626aedd6fa0336e92fb823f6345b6ff"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"87e5393a5725c003ea3f1e8b16b6a7ac22c48a6572934a4f3eec12a89ca32650"} Jan 27 11:36:56 crc kubenswrapper[4775]: I0127 11:36:56.427175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"4ee5eac6fa8d96f2c1251a9fa7a79066ea6f17f3f9fc3ba122ed91c65297289f"} Jan 27 11:36:58 crc kubenswrapper[4775]: I0127 11:36:58.972015 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-4hqln" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.221377 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.230994 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.258696 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.266579 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.278210 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.285243 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.289159 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335130 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335589 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") pod \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") pod \"066d45f9-5f72-4b81-8166-0238863b8789\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335672 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") pod \"066d45f9-5f72-4b81-8166-0238863b8789\" (UID: \"066d45f9-5f72-4b81-8166-0238863b8789\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335721 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 
11:36:59.335740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") pod \"14af2799-fccb-4f03-99f2-356e53df0f68\" (UID: \"14af2799-fccb-4f03-99f2-356e53df0f68\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.335762 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") pod \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\" (UID: \"58a046ea-e8eb-40ed-a64d-b382e0a2f331\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336862 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "58a046ea-e8eb-40ed-a64d-b382e0a2f331" (UID: "58a046ea-e8eb-40ed-a64d-b382e0a2f331"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336903 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.336921 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run" (OuterVolumeSpecName: "var-run") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339466 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "066d45f9-5f72-4b81-8166-0238863b8789" (UID: "066d45f9-5f72-4b81-8166-0238863b8789"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.339952 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.340124 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts" (OuterVolumeSpecName: "scripts") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm" (OuterVolumeSpecName: "kube-api-access-m7mfm") pod "58a046ea-e8eb-40ed-a64d-b382e0a2f331" (UID: "58a046ea-e8eb-40ed-a64d-b382e0a2f331"). InnerVolumeSpecName "kube-api-access-m7mfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346515 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s" (OuterVolumeSpecName: "kube-api-access-sdg2s") pod "14af2799-fccb-4f03-99f2-356e53df0f68" (UID: "14af2799-fccb-4f03-99f2-356e53df0f68"). InnerVolumeSpecName "kube-api-access-sdg2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.346541 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh" (OuterVolumeSpecName: "kube-api-access-7xnvh") pod "066d45f9-5f72-4b81-8166-0238863b8789" (UID: "066d45f9-5f72-4b81-8166-0238863b8789"). InnerVolumeSpecName "kube-api-access-7xnvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437467 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") pod \"c495d390-f7ca-4867-b334-263c03f6b211\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437559 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") pod \"90455f95-bcc6-4229-948c-599c91a08b2a\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.437940 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c495d390-f7ca-4867-b334-263c03f6b211" (UID: "c495d390-f7ca-4867-b334-263c03f6b211"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438038 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") pod \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438108 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") pod \"c495d390-f7ca-4867-b334-263c03f6b211\" (UID: \"c495d390-f7ca-4867-b334-263c03f6b211\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438215 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") pod \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") pod \"90455f95-bcc6-4229-948c-599c91a08b2a\" (UID: \"90455f95-bcc6-4229-948c-599c91a08b2a\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438353 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") pod \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\" (UID: \"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438585 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" (UID: "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438633 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" (UID: "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "90455f95-bcc6-4229-948c-599c91a08b2a" (UID: "90455f95-bcc6-4229-948c-599c91a08b2a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.438841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") pod \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\" (UID: \"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2\") " Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439431 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c495d390-f7ca-4867-b334-263c03f6b211-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439468 4775 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439480 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7mfm\" (UniqueName: \"kubernetes.io/projected/58a046ea-e8eb-40ed-a64d-b382e0a2f331-kube-api-access-m7mfm\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439491 4775 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439502 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/066d45f9-5f72-4b81-8166-0238863b8789-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439510 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439521 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xnvh\" (UniqueName: \"kubernetes.io/projected/066d45f9-5f72-4b81-8166-0238863b8789-kube-api-access-7xnvh\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439529 4775 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439537 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdg2s\" (UniqueName: \"kubernetes.io/projected/14af2799-fccb-4f03-99f2-356e53df0f68-kube-api-access-sdg2s\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439546 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14af2799-fccb-4f03-99f2-356e53df0f68-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439554 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/58a046ea-e8eb-40ed-a64d-b382e0a2f331-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439563 4775 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/14af2799-fccb-4f03-99f2-356e53df0f68-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439572 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.439579 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90455f95-bcc6-4229-948c-599c91a08b2a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.440659 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm" (OuterVolumeSpecName: "kube-api-access-2wmmm") pod "90455f95-bcc6-4229-948c-599c91a08b2a" (UID: "90455f95-bcc6-4229-948c-599c91a08b2a"). InnerVolumeSpecName "kube-api-access-2wmmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.441183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8" (OuterVolumeSpecName: "kube-api-access-z75v8") pod "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" (UID: "a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2"). InnerVolumeSpecName "kube-api-access-z75v8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.441688 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85" (OuterVolumeSpecName: "kube-api-access-x7v85") pod "c495d390-f7ca-4867-b334-263c03f6b211" (UID: "c495d390-f7ca-4867-b334-263c03f6b211"). InnerVolumeSpecName "kube-api-access-x7v85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.448100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj" (OuterVolumeSpecName: "kube-api-access-rjwkj") pod "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" (UID: "d8a7ac2f-36f7-49c5-96f9-6f8b19809b07"). InnerVolumeSpecName "kube-api-access-rjwkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449051 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-62xpg" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449065 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-62xpg" event={"ID":"c495d390-f7ca-4867-b334-263c03f6b211","Type":"ContainerDied","Data":"2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.449099 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c903f17abf32c1d41bdbc79bda89f18ca1f27003373db9f397f4ad0e4970669" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b21b-account-create-update-grvbp" event={"ID":"90455f95-bcc6-4229-948c-599c91a08b2a","Type":"ContainerDied","Data":"2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457278 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2f9e2bed91c1c37a6cc52de3ebdb332b65eec57fbad353ec8207b09e08bf89" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.457422 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b21b-account-create-update-grvbp" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.460883 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerStarted","Data":"adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462685 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-fcvx2" event={"ID":"a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2","Type":"ContainerDied","Data":"f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462756 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f564b76b65324e09d4e87879123fa8f55b7d4e8b86d0491590583b282fc26827" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.462820 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-fcvx2" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481053 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-x8mb5" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481397 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-x8mb5" event={"ID":"d8a7ac2f-36f7-49c5-96f9-6f8b19809b07","Type":"ContainerDied","Data":"0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.481481 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c10f26dd6cce5cd83ce525896aef63b1c2d771d55e56bce125b0b36c7b1b426" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b4bd-account-create-update-lztz8" event={"ID":"066d45f9-5f72-4b81-8166-0238863b8789","Type":"ContainerDied","Data":"5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510251 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cd6f7c3ddf66ceea30ede9c9ee55a0cd145c62a67d80fbe1199fbb04106349a" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.510301 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b4bd-account-create-update-lztz8" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513616 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-4hqln-config-zx9sc" event={"ID":"14af2799-fccb-4f03-99f2-356e53df0f68","Type":"ContainerDied","Data":"569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513661 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569bc48f314bcae5f8fec5dac1dd86102f455f3121b44f8b97ab460b30a9b972" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.513721 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-4hqln-config-zx9sc" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535692 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a920-account-create-update-7gdg6" event={"ID":"58a046ea-e8eb-40ed-a64d-b382e0a2f331","Type":"ContainerDied","Data":"645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31"} Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535731 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="645ec6bc16ac00e7d54aa6d239acefd093531e831232ba99e89dbf6d89597b31" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.535791 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a920-account-create-update-7gdg6" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541809 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjwkj\" (UniqueName: \"kubernetes.io/projected/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07-kube-api-access-rjwkj\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541844 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75v8\" (UniqueName: \"kubernetes.io/projected/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2-kube-api-access-z75v8\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541858 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wmmm\" (UniqueName: \"kubernetes.io/projected/90455f95-bcc6-4229-948c-599c91a08b2a-kube-api-access-2wmmm\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.541870 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7v85\" (UniqueName: \"kubernetes.io/projected/c495d390-f7ca-4867-b334-263c03f6b211-kube-api-access-x7v85\") on node \"crc\" DevicePath \"\"" Jan 27 11:36:59 crc kubenswrapper[4775]: I0127 11:36:59.857630 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.125112 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kc6bw" podStartSLOduration=5.673631258 podStartE2EDuration="10.125093772s" podCreationTimestamp="2026-01-27 11:36:50 +0000 UTC" firstStartedPulling="2026-01-27 11:36:54.630664856 +0000 UTC m=+993.772262633" lastFinishedPulling="2026-01-27 11:36:59.08212737 +0000 UTC m=+998.223725147" observedRunningTime="2026-01-27 11:36:59.49196689 +0000 UTC m=+998.633564687" watchObservedRunningTime="2026-01-27 11:37:00.125093772 +0000 UTC m=+999.266691549" Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.396163 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.420751 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-4hqln-config-zx9sc"] Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.549627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"e07502f26c911250d5b66b3441141cc60742d2d1456821147604fadd202668c5"} Jan 27 11:37:00 crc kubenswrapper[4775]: I0127 11:37:00.549678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"da5d969dc14b57f0d19726f5eeac3e68a8a4451ad144cbddedd792bd17c2286f"} Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.560820 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"3d026310fb65835934c92d3f872a5ae72a03a0f21fd058e0432dfc895d788fea"} Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561178 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"be6a6987d8adde43d996c27c63c7030dd8388653333f1b5cb987b62e05e6df90"} Jan 27 
11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561196 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"7fba88c01e837c306efea529653ad15f8560adfc196a2d66c87819e5f149678e"} Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561207 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"69422f80f6d8360d5ef3b9b108470f8bf2b7a2e3fd1188370efb001109582875"} Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.561221 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"b2f2b115-8dea-4dfa-a28e-5322f8fb8274","Type":"ContainerStarted","Data":"785fbff6082f4b621a054d1c6877ee393ea27dbf500b1f5df0edb5a7ed8cffb6"} Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.612114 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=19.931448964 podStartE2EDuration="37.612100201s" podCreationTimestamp="2026-01-27 11:36:24 +0000 UTC" firstStartedPulling="2026-01-27 11:36:42.48871662 +0000 UTC m=+981.630314397" lastFinishedPulling="2026-01-27 11:37:00.169367857 +0000 UTC m=+999.310965634" observedRunningTime="2026-01-27 11:37:01.6083854 +0000 UTC m=+1000.749983177" watchObservedRunningTime="2026-01-27 11:37:01.612100201 +0000 UTC m=+1000.753697978" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.756166 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" path="/var/lib/kubelet/pods/14af2799-fccb-4f03-99f2-356e53df0f68/volumes" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899462 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.899872 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899899 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.899932 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.899945 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900153 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900165 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900184 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900194 4775 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900214 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900223 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900233 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900243 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: E0127 11:37:01.900263 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900273 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900501 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900534 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900549 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c495d390-f7ca-4867-b334-263c03f6b211" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900559 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900574 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" containerName="mariadb-database-create" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900590 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="066d45f9-5f72-4b81-8166-0238863b8789" containerName="mariadb-account-create-update" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.900603 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="14af2799-fccb-4f03-99f2-356e53df0f68" containerName="ovn-config" Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.901621 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.903528 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.912314 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"]
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985518 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985645 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985710 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:01 crc kubenswrapper[4775]: I0127 11:37:01.985849 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087555 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087631 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087687 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087795 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.087850 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089089 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089192 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089210 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089274 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.089110 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.104628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"dnsmasq-dns-8467b54bcc-kbm75\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.226790 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:02 crc kubenswrapper[4775]: I0127 11:37:02.652497 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"]
Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.583965 4775 generic.go:334] "Generic (PLEG): container finished" podID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerID="e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9" exitCode=0
Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.584025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9"}
Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.584414 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerStarted","Data":"f0234531f1a183116689c44e0aec6118543c01688ecd27a430619636f971edf6"}
Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.586351 4775 generic.go:334] "Generic (PLEG): container finished" podID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerID="adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42" exitCode=0
Jan 27 11:37:03 crc kubenswrapper[4775]: I0127 11:37:03.586384 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerDied","Data":"adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42"}
Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.599917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerStarted","Data":"0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d"}
Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.625515 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" podStartSLOduration=3.6254439979999997 podStartE2EDuration="3.625443998s" podCreationTimestamp="2026-01-27 11:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:04.619976468 +0000 UTC m=+1003.761574285" watchObservedRunningTime="2026-01-27 11:37:04.625443998 +0000 UTC m=+1003.767041855"
Jan 27 11:37:04 crc kubenswrapper[4775]: I0127 11:37:04.901576 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kc6bw"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") "
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042432 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") "
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.042490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") pod \"71de6180-54da-4c3b-8aea-73a2ccfd936a\" (UID: \"71de6180-54da-4c3b-8aea-73a2ccfd936a\") "
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.047649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq" (OuterVolumeSpecName: "kube-api-access-777qq") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "kube-api-access-777qq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.065777 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.115809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data" (OuterVolumeSpecName: "config-data") pod "71de6180-54da-4c3b-8aea-73a2ccfd936a" (UID: "71de6180-54da-4c3b-8aea-73a2ccfd936a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.144792 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.145156 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71de6180-54da-4c3b-8aea-73a2ccfd936a-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.145429 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-777qq\" (UniqueName: \"kubernetes.io/projected/71de6180-54da-4c3b-8aea-73a2ccfd936a-kube-api-access-777qq\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kc6bw" event={"ID":"71de6180-54da-4c3b-8aea-73a2ccfd936a","Type":"ContainerDied","Data":"11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e"}
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611499 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11caf690b8e6315d486b022511a24646b3a13ddeba5aaf5fcd9be6d3ffa4371e"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611555 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.611390 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kc6bw"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.876432 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4jgdk"]
Jan 27 11:37:05 crc kubenswrapper[4775]: E0127 11:37:05.876832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.876851 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.877097 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" containerName="keystone-db-sync"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.877705 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.884210 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.886039 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.886050 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.887496 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.894704 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.912074 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"]
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.931702 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"]
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959844 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959913 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.959977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.960002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:05 crc kubenswrapper[4775]: I0127 11:37:05.960031 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.031991 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.033275 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.060937 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061002 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061042 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061075 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061099 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.061132 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.083005 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.091778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.092354 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.092494 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.094242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.128182 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.139058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"keystone-bootstrap-4jgdk\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162465 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.162512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.195801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.220537 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.221754 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.235680 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-rrpqs"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.235943 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.236499 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.236639 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.264380 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279205 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279253 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279281 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279318 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279337 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279392 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279481 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.279545 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.280137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.284949 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.285504 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.285941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.286427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.328315 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"dnsmasq-dns-58647bbf65-x27vf\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.349392 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.369964 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382060 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382110 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382144 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382173 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.382198 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.385944 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.386058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.386777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.387036 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.387247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.392392 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.392607 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.413766 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-xbnrk"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.414694 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.420959 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.421138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.425937 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dtgzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.428731 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"horizon-7c78fd876f-8p4lr\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.452365 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-2nfbz"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.453503 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.461709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4b27z"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.461964 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.478959 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484280 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484299 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484317 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484371 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484421 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484470 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484542 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.484565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.485518 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2nfbz"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.496362 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xbnrk"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.514332 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.516058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.532505 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.550734 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.583829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590569 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.590911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.591059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.591226 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593032 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593565 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.593844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594009 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594432 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594598 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594706 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.594846 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.595057 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.595171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.613025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.615522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.615589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.618353 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.622254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.624658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.624718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.634584 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.636546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.640725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.641054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.641964 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.643736 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-99pzl"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.646294 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.648236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"ceilometer-0\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.649251 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.650962 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.654161 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.654574 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rl8gp"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.681546 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"cinder-db-sync-xbnrk\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.689051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"barbican-db-sync-2nfbz\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.693622 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.695958 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697146 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697186 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697396 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.697475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.700243 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.704725 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.704767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.707976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.710044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-99pzl"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.715993 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"horizon-58cf66fb49-4l4kc\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.716340 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.720025 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.728712 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-74wvb"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.730192 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.732397 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.732768 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7dndl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.733042 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.750051 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-74wvb"]
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.762781 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-xbnrk"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.791365 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799500 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.799595 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800122 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl"
Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb"
Jan 27
11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.800382 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.804071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.806205 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.818408 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"neutron-db-sync-99pzl\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.854374 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902123 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902166 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902196 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902271 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902292 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902388 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.902409 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903702 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.903934 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.904209 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.912749 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.916183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod 
\"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.923529 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.929511 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"dnsmasq-dns-fd458c8cc-7vmm5\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") " pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.930849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"placement-db-sync-74wvb\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") " pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:06 crc kubenswrapper[4775]: I0127 11:37:06.988290 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.019126 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.065070 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb" Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.088420 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.104339 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.331215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.368559 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.663126 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.665413 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerStarted","Data":"4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9"} Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.667592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"e85e8e0f44ac4f6cbdc0a4bbf06db8528c1a4b4037fff448eea7f2f74eae3616"} Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.668628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerStarted","Data":"3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0"} Jan 27 11:37:07 crc 
kubenswrapper[4775]: I0127 11:37:07.669467 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" containerID="cri-o://0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" gracePeriod=10 Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.669683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerStarted","Data":"0ec26ab5f4eca94fe87dc449d18ade5b38eb6a8d9143ab8a2b319169b716216e"} Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.684308 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.734801 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:37:07 crc kubenswrapper[4775]: W0127 11:37:07.747136 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc73cda8b_d244_4ad1_8f54_f5680565327d.slice/crio-c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f WatchSource:0}: Error finding container c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f: Status 404 returned error can't find the container with id c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.772870 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.791383 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"] Jan 27 11:37:07 crc kubenswrapper[4775]: I0127 11:37:07.954143 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-74wvb"] Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.143174 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.161424 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.168888 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.178154 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.229060 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239152 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.239219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340558 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340608 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340651 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340680 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.340712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.341411 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.341853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.342889 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.356635 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.417312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"horizon-5f6cd994f7-2jm86\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.538137 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.677355 4775 generic.go:334] "Generic (PLEG): container finished" podID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerID="0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" exitCode=0 Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.677428 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d"} Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.679101 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerStarted","Data":"9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd"} Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.680748 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerStarted","Data":"86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6"} Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.681674 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"a1da85b3df4788f571e86de3391158e11cf2502b74702f3be38ea8d5b9dea0f2"} Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.683467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f"} Jan 27 11:37:08 crc kubenswrapper[4775]: I0127 11:37:08.684994 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerStarted","Data":"a084fdcf587e56beb20131ec45402a052d2991a945867b6c6e9adfa05c842c39"} Jan 27 11:37:10 crc kubenswrapper[4775]: W0127 11:37:10.093789 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c313125_cfde_424b_9bb3_acb232d20ba3.slice/crio-b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1 WatchSource:0}: Error finding container b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1: Status 404 returned error can't find the container with id b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1 Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.600937 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.699510 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerStarted","Data":"b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1"} Jan 27 11:37:10 crc kubenswrapper[4775]: I0127 11:37:10.700636 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"96c45e8e9930bf07afed2f11987b0afd9b083256c7c2af2e8c36913249d87fa8"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 
11:37:11.453987 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.598830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.598931 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599009 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599033 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.599131 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") pod \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\" (UID: \"77bba6d5-b2fc-4cb1-a104-f61fb146ae66\") " Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.617905 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz" (OuterVolumeSpecName: "kube-api-access-6wtgz") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "kube-api-access-6wtgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.684626 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.702210 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wtgz\" (UniqueName: \"kubernetes.io/projected/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-kube-api-access-6wtgz\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.702247 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.707984 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.710199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.723290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" event={"ID":"77bba6d5-b2fc-4cb1-a104-f61fb146ae66","Type":"ContainerDied","Data":"f0234531f1a183116689c44e0aec6118543c01688ecd27a430619636f971edf6"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732375 4775 scope.go:117] "RemoveContainer" containerID="0a1ee926e84e3e35b872d88cb08d96d4de13299dd229ad87fd274d8070f6832d" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732525 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8467b54bcc-kbm75" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.732743 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config" (OuterVolumeSpecName: "config") pod "77bba6d5-b2fc-4cb1-a104-f61fb146ae66" (UID: "77bba6d5-b2fc-4cb1-a104-f61fb146ae66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerStarted","Data":"4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771729 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerStarted","Data":"99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.771795 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerStarted","Data":"9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.780377 4775 scope.go:117] "RemoveContainer" containerID="e1beaaa4695018ded22e4c66ef6c8ed9e50da3ff9c5013e8aa00be310e0383e9" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.797487 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d2d75af-356d-4928-82f7-3555df136fac" containerID="546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3" exitCode=0 Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.800033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerDied","Data":"546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.815294 4775 generic.go:334] "Generic (PLEG): container finished" podID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerID="de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd" exitCode=0 Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.815665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd"} Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833537 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833576 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833586 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:11 crc kubenswrapper[4775]: I0127 11:37:11.833600 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/77bba6d5-b2fc-4cb1-a104-f61fb146ae66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.064549 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4jgdk" 
podStartSLOduration=7.064531024 podStartE2EDuration="7.064531024s" podCreationTimestamp="2026-01-27 11:37:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.036233587 +0000 UTC m=+1011.177831364" watchObservedRunningTime="2026-01-27 11:37:12.064531024 +0000 UTC m=+1011.206128801" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.076385 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-sd44h" podStartSLOduration=6.726539738 podStartE2EDuration="34.06843417s" podCreationTimestamp="2026-01-27 11:36:38 +0000 UTC" firstStartedPulling="2026-01-27 11:36:38.992139762 +0000 UTC m=+978.133737539" lastFinishedPulling="2026-01-27 11:37:06.334034204 +0000 UTC m=+1005.475631971" observedRunningTime="2026-01-27 11:37:12.064085341 +0000 UTC m=+1011.205683118" watchObservedRunningTime="2026-01-27 11:37:12.06843417 +0000 UTC m=+1011.210031947" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.090062 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-99pzl" podStartSLOduration=6.090045984 podStartE2EDuration="6.090045984s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.081120198 +0000 UTC m=+1011.222717985" watchObservedRunningTime="2026-01-27 11:37:12.090045984 +0000 UTC m=+1011.231643761" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.176543 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.186731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8467b54bcc-kbm75"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.187845 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.249517 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.249965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250017 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250043 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250076 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.250121 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") pod \"3d2d75af-356d-4928-82f7-3555df136fac\" (UID: \"3d2d75af-356d-4928-82f7-3555df136fac\") " Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.265129 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w" (OuterVolumeSpecName: "kube-api-access-9j59w") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "kube-api-access-9j59w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.273405 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.287866 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.306251 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.306319 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config" (OuterVolumeSpecName: "config") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.307778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3d2d75af-356d-4928-82f7-3555df136fac" (UID: "3d2d75af-356d-4928-82f7-3555df136fac"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352600 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352630 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352659 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352670 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352681 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d2d75af-356d-4928-82f7-3555df136fac-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.352691 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j59w\" (UniqueName: \"kubernetes.io/projected/3d2d75af-356d-4928-82f7-3555df136fac-kube-api-access-9j59w\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.849943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerStarted","Data":"d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1"} Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.850142 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856836 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" event={"ID":"3d2d75af-356d-4928-82f7-3555df136fac","Type":"ContainerDied","Data":"0ec26ab5f4eca94fe87dc449d18ade5b38eb6a8d9143ab8a2b319169b716216e"} Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856887 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58647bbf65-x27vf" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.856890 4775 scope.go:117] "RemoveContainer" containerID="546bed47b4c9b681b5faa13f7edb70b50e0378571f8957808614c8f8940092a3" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.877023 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" podStartSLOduration=6.876998855 podStartE2EDuration="6.876998855s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:12.867976087 +0000 UTC m=+1012.009573854" watchObservedRunningTime="2026-01-27 11:37:12.876998855 +0000 UTC m=+1012.018596632" Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.942984 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:12 crc kubenswrapper[4775]: I0127 11:37:12.944419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58647bbf65-x27vf"] Jan 27 11:37:13 crc kubenswrapper[4775]: I0127 11:37:13.803481 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2d75af-356d-4928-82f7-3555df136fac" path="/var/lib/kubelet/pods/3d2d75af-356d-4928-82f7-3555df136fac/volumes" Jan 27 11:37:13 crc kubenswrapper[4775]: I0127 11:37:13.805055 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" path="/var/lib/kubelet/pods/77bba6d5-b2fc-4cb1-a104-f61fb146ae66/volumes" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.898753 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.928633 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:14 crc kubenswrapper[4775]: E0127 11:37:14.929251 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929268 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: E0127 11:37:14.929283 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929291 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: E0127 11:37:14.929319 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929324 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.929521 4775 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="77bba6d5-b2fc-4cb1-a104-f61fb146ae66" containerName="dnsmasq-dns" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.930766 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2d75af-356d-4928-82f7-3555df136fac" containerName="init" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.931742 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.937128 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.945663 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:14 crc kubenswrapper[4775]: I0127 11:37:14.979256 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002745 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.002993 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.003064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.018444 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.021123 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.035064 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104486 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.104650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.105155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.106000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107229 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.107393 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.108741 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.109758 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.114925 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.126862 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208433 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208474 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208541 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208587 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208638 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.208666 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.221911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"horizon-84666cddfd-6l8vq\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.252275 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309624 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309685 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309765 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.309812 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310218 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-logs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-scripts\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.310686 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-config-data\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.319901 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-combined-ca-bundle\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.330105 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-secret-key\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.332610 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-horizon-tls-certs\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.333554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcwpt\" (UniqueName: \"kubernetes.io/projected/00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4-kube-api-access-tcwpt\") pod \"horizon-6546ffcc78-4zdnk\" (UID: \"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4\") " pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.346341 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.944009 4775 generic.go:334] "Generic (PLEG): container finished" podID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerID="4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a" exitCode=0 Jan 27 11:37:15 crc kubenswrapper[4775]: I0127 11:37:15.944324 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerDied","Data":"4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a"} Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.022247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.087577 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.087787 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" containerID="cri-o://5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34" gracePeriod=10 Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.969712 4775 generic.go:334] "Generic (PLEG): container finished" podID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerID="5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34" exitCode=0 Jan 27 11:37:17 crc kubenswrapper[4775]: I0127 11:37:17.969751 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34"} Jan 27 11:37:20 crc kubenswrapper[4775]: I0127 11:37:20.133639 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 27 11:37:25 crc kubenswrapper[4775]: I0127 11:37:25.133439 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: connect: connection refused" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.808270 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.808483 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lmq8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-db-sync-74wvb_openstack(5c313125-cfde-424b-9bb3-acb232d20ba3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:26 crc kubenswrapper[4775]: E0127 11:37:26.809667 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-74wvb" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.050505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:33f4e5f7a715d48482ec46a42267ea992fa268585303c4f1bd3cbea072a6348b\\\"\"" pod="openstack/placement-db-sync-74wvb" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.195966 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777" Jan 27 11:37:27 crc kubenswrapper[4775]: E0127 11:37:27.196138 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5a548c25fe3d02f7a042cb0a6d28fc8039a34c4a3b3d07aadda4aba3a926e777,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n657h58dh59bh7bh68dhc8h5b9h569h564h75h75hb7hd6h597h675hb9h556h649h689h54dh554h8dh7bh9h8dh5f7h5b4h7bh5fdh8dh94h59fq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z5n55,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(8a82d041-4b07-491a-8af6-232e67a23299): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.350603 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482969 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.482991 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483067 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.483149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") pod \"e6fc373f-0642-464e-81c9-b78a27dfebbe\" (UID: \"e6fc373f-0642-464e-81c9-b78a27dfebbe\") " Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.489661 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.491313 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.494274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq" (OuterVolumeSpecName: "kube-api-access-7wwxq") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "kube-api-access-7wwxq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.509268 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts" (OuterVolumeSpecName: "scripts") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.510570 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.514217 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data" (OuterVolumeSpecName: "config-data") pod "e6fc373f-0642-464e-81c9-b78a27dfebbe" (UID: "e6fc373f-0642-464e-81c9-b78a27dfebbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586028 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wwxq\" (UniqueName: \"kubernetes.io/projected/e6fc373f-0642-464e-81c9-b78a27dfebbe-kube-api-access-7wwxq\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586181 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586273 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586357 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586432 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:27 crc kubenswrapper[4775]: I0127 11:37:27.586567 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6fc373f-0642-464e-81c9-b78a27dfebbe-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056439 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4jgdk" event={"ID":"e6fc373f-0642-464e-81c9-b78a27dfebbe","Type":"ContainerDied","Data":"4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9"} Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056492 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ce590bb5f4400c025e962ac68e489a2398ff50958ad307497f1941c218a2bb9" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.056550 4775 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4jgdk" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.059143 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerID="9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e" exitCode=0 Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.059175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerDied","Data":"9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e"} Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.442345 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.451080 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4jgdk"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:28 crc kubenswrapper[4775]: E0127 11:37:28.535679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535696 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.535864 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" containerName="keystone-bootstrap" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.536463 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540104 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540389 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540587 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.540902 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.541094 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.548837 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.605979 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606029 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606059 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606223 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606258 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.606286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.707992 4775 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708076 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708171 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.708211 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735308 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735311 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735358 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735433 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " 
pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.735444 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.737984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"keystone-bootstrap-gcjrx\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") " pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:28 crc kubenswrapper[4775]: I0127 11:37:28.853156 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx" Jan 27 11:37:29 crc kubenswrapper[4775]: I0127 11:37:29.756478 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fc373f-0642-464e-81c9-b78a27dfebbe" path="/var/lib/kubelet/pods/e6fc373f-0642-464e-81c9-b78a27dfebbe/volumes" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.109648 4775 generic.go:334] "Generic (PLEG): container finished" podID="73aaf8f0-0380-4eff-875b-90da115dba37" containerID="99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf" exitCode=0 Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.109752 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerDied","Data":"99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf"} Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.133502 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.133790 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.366186 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16" Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.366351 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ftqgp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-2nfbz_openstack(0edaeaa2-aa90-484f-854c-db5dd181f61b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:35 crc kubenswrapper[4775]: E0127 11:37:35.367574 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-2nfbz" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.482984 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.491201 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-sd44h" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.562326 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563503 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563616 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563684 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") pod \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\" (UID: \"ca5aab7c-3b7a-4996-82f5-478d4100bb6c\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563852 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.563875 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") pod \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\" (UID: \"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8\") " Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.567999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh" (OuterVolumeSpecName: "kube-api-access-sgdvh") pod 
"ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "kube-api-access-sgdvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.568362 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.569360 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd" (OuterVolumeSpecName: "kube-api-access-g82sd") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "kube-api-access-g82sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.608267 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.613015 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.617168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config" (OuterVolumeSpecName: "config") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.622942 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.623397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data" (OuterVolumeSpecName: "config-data") pod "ca5aab7c-3b7a-4996-82f5-478d4100bb6c" (UID: "ca5aab7c-3b7a-4996-82f5-478d4100bb6c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.625886 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" (UID: "c24ee1fa-0d6a-4ca1-b298-d876f473f8f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666151 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666188 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666202 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g82sd\" (UniqueName: \"kubernetes.io/projected/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-kube-api-access-g82sd\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666216 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666228 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgdvh\" (UniqueName: \"kubernetes.io/projected/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-kube-api-access-sgdvh\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666238 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666249 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666268 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca5aab7c-3b7a-4996-82f5-478d4100bb6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:35 crc kubenswrapper[4775]: I0127 11:37:35.666280 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119281 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" event={"ID":"c24ee1fa-0d6a-4ca1-b298-d876f473f8f8","Type":"ContainerDied","Data":"28976e350fb8ecd8fa41a546d6bc48a308f3c35b6b458e7b2f0ad3f0838c3094"} Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119323 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.119347 4775 scope.go:117] "RemoveContainer" containerID="5e752a3391827672fa37b60e71b5a6f3c1262d98795c1a40cb7662f381943f34" Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-sd44h" event={"ID":"ca5aab7c-3b7a-4996-82f5-478d4100bb6c","Type":"ContainerDied","Data":"6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6"} Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125070 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6240242f7a09936cfd2e2c9ff20e6303a6fa610f8151f73cb6a49267032567b6" Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.125119 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-sd44h" Jan 27 11:37:36 crc kubenswrapper[4775]: E0127 11:37:36.127229 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:fe32d3ea620f0c7ecfdde9bbf28417fde03bc18c6f60b1408fa8da24d8188f16\\\"\"" pod="openstack/barbican-db-sync-2nfbz" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.160474 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:37:36 crc kubenswrapper[4775]: I0127 11:37:36.171276 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6cb545bd4c-xrw7x"] Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.704888 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.705364 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h7xk8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-xbnrk_openstack(2029cc7b-c115-4c17-8713-c6eed291e963): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:36.706500 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-xbnrk" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.812662 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.904942 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.905168 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.905243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") pod \"73aaf8f0-0380-4eff-875b-90da115dba37\" (UID: \"73aaf8f0-0380-4eff-875b-90da115dba37\") " Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.942111 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r" (OuterVolumeSpecName: "kube-api-access-kvd5r") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "kube-api-access-kvd5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.951201 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config" (OuterVolumeSpecName: "config") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:36.981294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "73aaf8f0-0380-4eff-875b-90da115dba37" (UID: "73aaf8f0-0380-4eff-875b-90da115dba37"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012847 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012873 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73aaf8f0-0380-4eff-875b-90da115dba37-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.012883 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvd5r\" (UniqueName: \"kubernetes.io/projected/73aaf8f0-0380-4eff-875b-90da115dba37-kube-api-access-kvd5r\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085179 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085761 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085792 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="init" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085797 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="init" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085816 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085823 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.085832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.085838 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086038 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" containerName="glance-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086052 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.086060 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" containerName="neutron-db-sync" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.087042 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.103149 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.166282 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-99pzl" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.167046 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-99pzl" event={"ID":"73aaf8f0-0380-4eff-875b-90da115dba37","Type":"ContainerDied","Data":"86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6"} Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.167077 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86cd01583ba668ca9ee9332ef9ae7b46a7e4472e08a57756b024346b5439a7c6" Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.169268 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:b59b7445e581cc720038107e421371c86c5765b2967e77d884ef29b1d9fd0f49\\\"\"" pod="openstack/cinder-db-sync-xbnrk" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.218797 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219120 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219169 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219185 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.219229 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320557 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320617 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320647 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320676 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320691 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.320727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.323193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.324894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325850 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.325998 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.339193 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:37 crc kubenswrapper[4775]: E0127 11:37:37.339809 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zz6sr], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" podUID="957bd5b8-fe11-4f5e-b796-91f1ab9450c2" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.354130 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.356054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"dnsmasq-dns-5dc4fcdbc-d6mrq\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.356381 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.380756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422256 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.422428 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.469997 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.487954 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.493740 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499077 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499331 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-rl8gp" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499517 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.499665 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.516438 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525330 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.525371 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526462 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " 
pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526570 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.526577 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.527083 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.548700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"dnsmasq-dns-6b9c8b59c-7jpkg\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630123 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630171 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630203 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630224 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 
11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.630493 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.694997 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735494 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735550 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735580 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.735675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.739938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.739987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.750332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " 
pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.750366 4775 scope.go:117] "RemoveContainer" containerID="b5dc76210b8840ce4aa3ed6531d8e2c91e46aaffef6ddac900a9922372f2a92b" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.758470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.759019 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"neutron-66f4cff584-s28fg\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.769903 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" path="/var/lib/kubelet/pods/c24ee1fa-0d6a-4ca1-b298-d876f473f8f8/volumes" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.814157 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.961045 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.964703 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.968870 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.969830 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-ghp7c" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.970091 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 27 11:37:37 crc kubenswrapper[4775]: I0127 11:37:37.994782 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046192 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046326 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.046443 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149722 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149864 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.149880 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: 
\"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.150019 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.150061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153612 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153641 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.153885 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.156787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.160145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.166148 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.170524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc 
kubenswrapper[4775]: I0127 11:37:38.179176 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.181561 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.184188 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.191527 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.198089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"67af1fcb0bcad60b4d6220dc2a58636c77413c902e1d5d58f9a296545b8c138a"} Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.209223 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.233215 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.236700 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.236607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6546ffcc78-4zdnk"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251895 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251939 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.251991 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252095 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252131 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252200 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.252248 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.313927 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.322756 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:37:38 crc kubenswrapper[4775]: W0127 11:37:38.344414 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba461ef4_49c1_4edc_ac60_1dfb91642c46.slice/crio-0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843 WatchSource:0}: Error finding container 0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843: Status 404 returned error can't find the container with id 0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843 Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353678 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.353722 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") pod \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\" (UID: \"957bd5b8-fe11-4f5e-b796-91f1ab9450c2\") " Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config" (OuterVolumeSpecName: "config") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.354849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356102 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356522 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356650 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356738 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356796 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356937 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356947 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356958 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc 
kubenswrapper[4775]: I0127 11:37:38.356966 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.356973 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.357067 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.357771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.358649 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.376933 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.385759 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr" (OuterVolumeSpecName: "kube-api-access-zz6sr") pod "957bd5b8-fe11-4f5e-b796-91f1ab9450c2" (UID: "957bd5b8-fe11-4f5e-b796-91f1ab9450c2"). InnerVolumeSpecName "kube-api-access-zz6sr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.390375 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.391278 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.394109 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.465239 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz6sr\" (UniqueName: \"kubernetes.io/projected/957bd5b8-fe11-4f5e-b796-91f1ab9450c2-kube-api-access-zz6sr\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.482671 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.593299 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.625710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:37:38 crc kubenswrapper[4775]: I0127 11:37:38.830606 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.048112 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:39 crc kubenswrapper[4775]: W0127 11:37:39.072640 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18c65a1a_cace_450e_bd9d_b2f6824e6add.slice/crio-a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157 WatchSource:0}: Error finding container a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157: Status 404 returned error can't find the container with id a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.225785 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerStarted","Data":"0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.237069 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.253204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.255967 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerStarted","Data":"f613d08fcd685ed44899c259a171ad733b3147458ae9f365bbc1e423524fcf00"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.261701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.262054 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cf66fb49-4l4kc" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" containerID="cri-o://5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.262099 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-58cf66fb49-4l4kc" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" containerID="cri-o://27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.264847 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"c1fec8af272f03a1a22bffcede83ebeeb34fe78f0c6c8f7e8812b5c385fd5e75"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.264878 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" 
event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"1605ce07b00108a41ae30e35ee9b929a02da1e4e7749dbe81b9666e441c5dc91"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.268841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.268979 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c78fd876f-8p4lr" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" containerID="cri-o://5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.269072 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c78fd876f-8p4lr" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" containerID="cri-o://2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c" gracePeriod=30 Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.279204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.279243 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"98a20e3bbe057f1a1083416d0cff14282fdc9e2fca7261f4540fdf9a82145994"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.280544 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.284254 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5dc4fcdbc-d6mrq" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.284518 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2"} Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.290804 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-58cf66fb49-4l4kc" podStartSLOduration=3.2905261120000002 podStartE2EDuration="33.290790994s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.751026363 +0000 UTC m=+1006.892624150" lastFinishedPulling="2026-01-27 11:37:37.751291255 +0000 UTC m=+1036.892889032" observedRunningTime="2026-01-27 11:37:39.290353092 +0000 UTC m=+1038.431950859" watchObservedRunningTime="2026-01-27 11:37:39.290790994 +0000 UTC m=+1038.432388771" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.333791 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-84666cddfd-6l8vq" podStartSLOduration=25.333774803 podStartE2EDuration="25.333774803s" podCreationTimestamp="2026-01-27 11:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:39.321121686 +0000 UTC m=+1038.462719473" watchObservedRunningTime="2026-01-27 11:37:39.333774803 +0000 UTC m=+1038.475372580" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.377219 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c78fd876f-8p4lr" podStartSLOduration=4.092796103 podStartE2EDuration="33.377192873s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.368950875 +0000 UTC m=+1006.510548642" lastFinishedPulling="2026-01-27 11:37:36.653347635 +0000 UTC m=+1035.794945412" observedRunningTime="2026-01-27 11:37:39.372817333 +0000 UTC m=+1038.514415130" watchObservedRunningTime="2026-01-27 11:37:39.377192873 +0000 UTC m=+1038.518790650" Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.517262 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.547876 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5dc4fcdbc-d6mrq"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.638218 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:39 crc kubenswrapper[4775]: I0127 11:37:39.776556 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="957bd5b8-fe11-4f5e-b796-91f1ab9450c2" path="/var/lib/kubelet/pods/957bd5b8-fe11-4f5e-b796-91f1ab9450c2/volumes" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.134187 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6cb545bd4c-xrw7x" podUID="c24ee1fa-0d6a-4ca1-b298-d876f473f8f8" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.117:5353: i/o timeout" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.307338 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" 
event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerStarted","Data":"d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.318633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"4bdedb28b55f515c534118238ae1a6d785ae6ac96c7d8f97a7f78f628958a487"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.324132 4775 generic.go:334] "Generic (PLEG): container finished" podID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerID="93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168" exitCode=0 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.324403 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.342145 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerStarted","Data":"27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.364328 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-gcjrx" podStartSLOduration=12.364312964 podStartE2EDuration="12.364312964s" podCreationTimestamp="2026-01-27 11:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.330583389 +0000 UTC m=+1039.472181166" watchObservedRunningTime="2026-01-27 11:37:40.364312964 +0000 UTC m=+1039.505910741" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.364794 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerStarted","Data":"0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.375897 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerStarted","Data":"2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.391555 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerStarted","Data":"a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.392063 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.406291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.419099 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66f4cff584-s28fg" podStartSLOduration=3.419075425 
podStartE2EDuration="3.419075425s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.411094646 +0000 UTC m=+1039.552692433" watchObservedRunningTime="2026-01-27 11:37:40.419075425 +0000 UTC m=+1039.560673202" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.427668 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6546ffcc78-4zdnk" event={"ID":"00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4","Type":"ContainerStarted","Data":"9bee1aab4d317b8a9716f1db1b63ae74d2cad3853b202e1ae2748039624764ee"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.435660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerStarted","Data":"69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99"} Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.435805 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6cd994f7-2jm86" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" containerID="cri-o://d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2" gracePeriod=30 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.436044 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5f6cd994f7-2jm86" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" containerID="cri-o://69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99" gracePeriod=30 Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.468441 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6546ffcc78-4zdnk" podStartSLOduration=26.468421738 podStartE2EDuration="26.468421738s" podCreationTimestamp="2026-01-27 11:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:40.461662733 +0000 UTC m=+1039.603260520" watchObservedRunningTime="2026-01-27 11:37:40.468421738 +0000 UTC m=+1039.610019515" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.507886 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5f6cd994f7-2jm86" podStartSLOduration=5.2342435720000005 podStartE2EDuration="32.50786139s" podCreationTimestamp="2026-01-27 11:37:08 +0000 UTC" firstStartedPulling="2026-01-27 11:37:10.610110239 +0000 UTC m=+1009.751708016" lastFinishedPulling="2026-01-27 11:37:37.883728057 +0000 UTC m=+1037.025325834" observedRunningTime="2026-01-27 11:37:40.489620629 +0000 UTC m=+1039.631218426" watchObservedRunningTime="2026-01-27 11:37:40.50786139 +0000 UTC m=+1039.649459167" Jan 27 11:37:40 crc kubenswrapper[4775]: I0127 11:37:40.946562 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.035550 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.445556 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerStarted","Data":"f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1"} Jan 27 11:37:41 crc 
kubenswrapper[4775]: I0127 11:37:41.445724 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" containerID="cri-o://f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" gracePeriod=30 Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.445704 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log" containerID="cri-o://fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" gracePeriod=30 Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.459989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerStarted","Data":"a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a"} Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.460142 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.466291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6"} Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.479173 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.479157356 podStartE2EDuration="5.479157356s" podCreationTimestamp="2026-01-27 11:37:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:41.471900367 +0000 UTC m=+1040.613498164" watchObservedRunningTime="2026-01-27 11:37:41.479157356 +0000 UTC m=+1040.620755133" Jan 27 11:37:41 crc kubenswrapper[4775]: I0127 11:37:41.509074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" podStartSLOduration=4.509054456 podStartE2EDuration="4.509054456s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:41.501764696 +0000 UTC m=+1040.643362493" watchObservedRunningTime="2026-01-27 11:37:41.509054456 +0000 UTC m=+1040.650652233" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerStarted","Data":"89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478525 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" containerID="cri-o://3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6" gracePeriod=30 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.478551 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="722b4859-0679-4bb0-98eb-c4168101124e" 
containerName="glance-httpd" containerID="cri-o://89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd" gracePeriod=30 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481184 4775 generic.go:334] "Generic (PLEG): container finished" podID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerID="f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" exitCode=0 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481226 4775 generic.go:334] "Generic (PLEG): container finished" podID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerID="fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" exitCode=143 Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.481598 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2"} Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.504793 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.504774922 podStartE2EDuration="5.504774922s" podCreationTimestamp="2026-01-27 11:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:42.504108564 +0000 UTC m=+1041.645706361" watchObservedRunningTime="2026-01-27 11:37:42.504774922 +0000 UTC m=+1041.646372699" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.902473 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.904135 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.914735 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.915025 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 27 11:37:42 crc kubenswrapper[4775]: I0127 11:37:42.922058 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000763 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000833 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.000989 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.001036 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103068 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103154 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103238 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103302 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103342 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.103388 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.109814 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.120035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.123871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: 
\"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.126945 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.127289 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.136048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.138569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"neutron-55b847b569-ccplz\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.289961 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.535842 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610097 4775 generic.go:334] "Generic (PLEG): container finished" podID="722b4859-0679-4bb0-98eb-c4168101124e" containerID="89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd" exitCode=0 Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610420 4775 generic.go:334] "Generic (PLEG): container finished" podID="722b4859-0679-4bb0-98eb-c4168101124e" containerID="3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6" exitCode=143 Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.610542 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616057 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616096 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616112 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616234 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.616285 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod 
\"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.619895 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs" (OuterVolumeSpecName: "logs") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.624281 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts" (OuterVolumeSpecName: "scripts") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.625528 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.629609 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w" (OuterVolumeSpecName: "kube-api-access-kjg8w") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "kube-api-access-kjg8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638098 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638221 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638174 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"18c65a1a-cace-450e-bd9d-b2f6824e6add","Type":"ContainerDied","Data":"a3f179a7ac00e0532b27c38316f1ec93e3695e21e950a753c358b6a0f3438157"} Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.638341 4775 scope.go:117] "RemoveContainer" containerID="f8224fd31f30d09352f5aea4baa31229da2f5e9e5507029c200c09cf75c989b1" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.717778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.718668 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") pod \"18c65a1a-cace-450e-bd9d-b2f6824e6add\" (UID: \"18c65a1a-cace-450e-bd9d-b2f6824e6add\") " Jan 27 11:37:43 crc kubenswrapper[4775]: W0127 11:37:43.718834 4775 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/18c65a1a-cace-450e-bd9d-b2f6824e6add/volumes/kubernetes.io~secret/combined-ca-bundle Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.718853 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.720752 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjg8w\" (UniqueName: \"kubernetes.io/projected/18c65a1a-cace-450e-bd9d-b2f6824e6add-kube-api-access-kjg8w\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721046 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721076 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721088 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721210 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/18c65a1a-cace-450e-bd9d-b2f6824e6add-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.721236 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.742648 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data" (OuterVolumeSpecName: "config-data") pod "18c65a1a-cace-450e-bd9d-b2f6824e6add" (UID: "18c65a1a-cace-450e-bd9d-b2f6824e6add"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.745732 4775 scope.go:117] "RemoveContainer" containerID="fde8c7d2735c59cd1b280864e83f75b8e5f7d802a5aa4ef9ca184716dbbfbcb2" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.752582 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.770345 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.822761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823233 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823402 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.823497 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") pod \"722b4859-0679-4bb0-98eb-c4168101124e\" (UID: \"722b4859-0679-4bb0-98eb-c4168101124e\") " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824317 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18c65a1a-cace-450e-bd9d-b2f6824e6add-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824342 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.824941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.826970 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs" (OuterVolumeSpecName: "logs") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.829718 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6" (OuterVolumeSpecName: "kube-api-access-xcgj6") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "kube-api-access-xcgj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.833761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts" (OuterVolumeSpecName: "scripts") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.837947 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.860650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.888811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data" (OuterVolumeSpecName: "config-data") pod "722b4859-0679-4bb0-98eb-c4168101124e" (UID: "722b4859-0679-4bb0-98eb-c4168101124e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933113 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgj6\" (UniqueName: \"kubernetes.io/projected/722b4859-0679-4bb0-98eb-c4168101124e-kube-api-access-xcgj6\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933149 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933181 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933191 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933199 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933208 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/722b4859-0679-4bb0-98eb-c4168101124e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.933215 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/722b4859-0679-4bb0-98eb-c4168101124e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.980607 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.993462 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:37:43 crc kubenswrapper[4775]: I0127 11:37:43.995966 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.032773 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033268 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033287 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd" Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033321 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033331 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log" Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033349 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" 
containerName="glance-log"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033357 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log"
Jan 27 11:37:44 crc kubenswrapper[4775]: E0127 11:37:44.033386 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033397 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033687 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-log"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033720 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-httpd"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033741 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="722b4859-0679-4bb0-98eb-c4168101124e" containerName="glance-log"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.033757 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" containerName="glance-httpd"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.034931 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.035091 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040287 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040504 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.040673 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.132941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-55b847b569-ccplz"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136647 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136722 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136815 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136904 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136938 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.136972 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.137008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242531 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242562 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.242657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.243836 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.244035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.249807 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.257292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.257746 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.267004 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.267868 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.290285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.306413 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.352675 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.672774 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerStarted","Data":"398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912"}
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.687882 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d"}
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.687937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"129e86fff0154f3e4de3082e715fe1284c270556711420ae01c9066fffafb3c8"}
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"722b4859-0679-4bb0-98eb-c4168101124e","Type":"ContainerDied","Data":"4bdedb28b55f515c534118238ae1a6d785ae6ac96c7d8f97a7f78f628958a487"}
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700809 4775 scope.go:117] "RemoveContainer" containerID="89b67f39524e6b44e39534c7099a066bd9bf7d085aefc4683f0121281aee95cd"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.700934 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.771800 4775 scope.go:117] "RemoveContainer" containerID="3779853a99e9f3e08be331ae752e4b12549efe927c88e8d16d89e0b55ff7fac6"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.784074 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-74wvb" podStartSLOduration=5.706032541 podStartE2EDuration="38.784039747s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:10.095633519 +0000 UTC m=+1009.237231296" lastFinishedPulling="2026-01-27 11:37:43.173640725 +0000 UTC m=+1042.315238502" observedRunningTime="2026-01-27 11:37:44.699581762 +0000 UTC m=+1043.841179559" watchObservedRunningTime="2026-01-27 11:37:44.784039747 +0000 UTC m=+1043.925637514"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.797560 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.827859 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.854320 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.856251 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.862032 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.862512 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.867242 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964406 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964504 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964551 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964570 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964607 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964627 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964648 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:44 crc kubenswrapper[4775]: I0127 11:37:44.964676 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.016397 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.066677 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067038 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067126 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067160 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067239 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067309 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.067307 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.068906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.068992 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.076307 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.085373 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.086292 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.093032 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.120024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.161200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.181783 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.253814 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-84666cddfd-6l8vq"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.253906 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84666cddfd-6l8vq"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.346548 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6546ffcc78-4zdnk"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.346909 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6546ffcc78-4zdnk"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.584555 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.711291 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"815ca40b27fb4cea044b33dd23bf33c1b082f912269530f93879da29eb229030"}
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.714202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerStarted","Data":"8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe"}
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.714338 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-55b847b569-ccplz"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.717903 4775 generic.go:334] "Generic (PLEG): container finished" podID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerID="d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32" exitCode=0
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.717949 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerDied","Data":"d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32"}
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.723820 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"4be346d9744f80cbe9acdb090392b9c63c5e0cb6ed893fe6b3ae4a4e7c97ad5e"}
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.734490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-55b847b569-ccplz" podStartSLOduration=3.734474472 podStartE2EDuration="3.734474472s" podCreationTimestamp="2026-01-27 11:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:45.73185288 +0000 UTC m=+1044.873450667" watchObservedRunningTime="2026-01-27 11:37:45.734474472 +0000 UTC m=+1044.876072249"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.756356 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18c65a1a-cace-450e-bd9d-b2f6824e6add" path="/var/lib/kubelet/pods/18c65a1a-cace-450e-bd9d-b2f6824e6add/volumes"
Jan 27 11:37:45 crc kubenswrapper[4775]: I0127 11:37:45.762282 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722b4859-0679-4bb0-98eb-c4168101124e" path="/var/lib/kubelet/pods/722b4859-0679-4bb0-98eb-c4168101124e/volumes"
Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.584997 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c78fd876f-8p4lr"
Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.745261 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerID="398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912" exitCode=0
Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.745348 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerDied","Data":"398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912"}
Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.749972 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"}
Jan 27 11:37:46 crc kubenswrapper[4775]: I0127 11:37:46.855202 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-58cf66fb49-4l4kc"
Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 11:37:47.696651 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg"
Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 11:37:47.795263 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:47 crc kubenswrapper[4775]: I0127 11:37:47.795645 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns" containerID="cri-o://d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1" gracePeriod=10
Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.538530 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5f6cd994f7-2jm86"
Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.819143 4775 generic.go:334] "Generic (PLEG): container finished" podID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerID="d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1" exitCode=0
Jan 27 11:37:48 crc kubenswrapper[4775]: I0127 11:37:48.819188 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1"}
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.830606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-gcjrx" event={"ID":"ba461ef4-49c1-4edc-ac60-1dfb91642c46","Type":"ContainerDied","Data":"0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843"}
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.830880 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cf9ab76fde0041be8ae70523fe40d1b2d1f81743365c3059afc3bdf84348843"
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.834599 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-74wvb" event={"ID":"5c313125-cfde-424b-9bb3-acb232d20ba3","Type":"ContainerDied","Data":"b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1"}
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.834631 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0bc4a39609e848a771abb9f53ba789c3ab85ce7b53e0dfd4f329f9af932dba1"
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.922464 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.959675 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984260 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984501 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:49 crc kubenswrapper[4775]: I0127 11:37:49.984552 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") pod \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\" (UID: \"ba461ef4-49c1-4edc-ac60-1dfb91642c46\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.003197 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm" (OuterVolumeSpecName: "kube-api-access-wl8lm") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "kube-api-access-wl8lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.012175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.014597 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.025700 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts" (OuterVolumeSpecName: "scripts") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.039241 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data" (OuterVolumeSpecName: "config-data") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.039340 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba461ef4-49c1-4edc-ac60-1dfb91642c46" (UID: "ba461ef4-49c1-4edc-ac60-1dfb91642c46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086490 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086594 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086663 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086738 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.086939 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") pod \"5c313125-cfde-424b-9bb3-acb232d20ba3\" (UID: \"5c313125-cfde-424b-9bb3-acb232d20ba3\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087287 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087299 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087309 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087317 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl8lm\" (UniqueName: \"kubernetes.io/projected/ba461ef4-49c1-4edc-ac60-1dfb91642c46-kube-api-access-wl8lm\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087325 4775 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.087333 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba461ef4-49c1-4edc-ac60-1dfb91642c46-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.090335 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs" (OuterVolumeSpecName: "logs") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.091513 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8" (OuterVolumeSpecName: "kube-api-access-9lmq8") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "kube-api-access-9lmq8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.096504 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts" (OuterVolumeSpecName: "scripts") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.122579 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.123927 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data" (OuterVolumeSpecName: "config-data") pod "5c313125-cfde-424b-9bb3-acb232d20ba3" (UID: "5c313125-cfde-424b-9bb3-acb232d20ba3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192194 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192570 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lmq8\" (UniqueName: \"kubernetes.io/projected/5c313125-cfde-424b-9bb3-acb232d20ba3-kube-api-access-9lmq8\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192583 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192593 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c313125-cfde-424b-9bb3-acb232d20ba3-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.192624 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c313125-cfde-424b-9bb3-acb232d20ba3-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.421566 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497301 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497381 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497645 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.497901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") pod \"0276dc98-8972-465b-bf5a-e222c73eb8a0\" (UID: \"0276dc98-8972-465b-bf5a-e222c73eb8a0\") "
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.514211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt" (OuterVolumeSpecName: "kube-api-access-524jt") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "kube-api-access-524jt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.593824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config" (OuterVolumeSpecName: "config") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.600893 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-524jt\" (UniqueName: \"kubernetes.io/projected/0276dc98-8972-465b-bf5a-e222c73eb8a0-kube-api-access-524jt\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.600921 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-config\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.602278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.602913 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.607225 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.607344 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0276dc98-8972-465b-bf5a-e222c73eb8a0" (UID: "0276dc98-8972-465b-bf5a-e222c73eb8a0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702214 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702583 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702678 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.702713 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0276dc98-8972-465b-bf5a-e222c73eb8a0-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845874 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-74wvb"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845861 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5" event={"ID":"0276dc98-8972-465b-bf5a-e222c73eb8a0","Type":"ContainerDied","Data":"a084fdcf587e56beb20131ec45402a052d2991a945867b6c6e9adfa05c842c39"}
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.849535 4775 scope.go:117] "RemoveContainer" containerID="d3b7ef27bbcce0f78db6507da3665b881a0a8b58ddbc436efa6b111cc5cd68a1"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845925 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fd458c8cc-7vmm5"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.845970 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-gcjrx"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.892508 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.899612 4775 scope.go:117] "RemoveContainer" containerID="de00e87fa01e98a6d0ad8af61db692885f2ec794526f456407919d9326501ccd"
Jan 27 11:37:50 crc kubenswrapper[4775]: I0127 11:37:50.899819 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fd458c8cc-7vmm5"]
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.114870 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5994598694-dhq5v"]
Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115616 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115635 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync"
Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115661 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="init"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115668 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="init"
Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115676 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115683 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap"
Jan 27 11:37:51 crc kubenswrapper[4775]: E0127 11:37:51.115702 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115709 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115902 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" containerName="dnsmasq-dns"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115947 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" containerName="placement-db-sync"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.115959 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" containerName="keystone-bootstrap"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.116593 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122280 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122548 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122685 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.122970 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.139897 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.140240 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-btkr8"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.153701 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5994598694-dhq5v"]
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219002 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219098 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219205 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219242 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219281 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.219555 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.263932 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"]
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.266504 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.276656 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7dndl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.277596 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279218 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279470 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.279545 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.281418 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"]
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321774 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321814 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321888 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321938 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.321981 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322029 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322053 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322119 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322249 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322285 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322332 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322418 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.322492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.335259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-credential-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.335791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-public-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.337127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-fernet-keys\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.339801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-scripts\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.341564 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-internal-tls-certs\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.342126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-combined-ca-bundle\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.344677 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-config-data\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.362014 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7wrg\" (UniqueName: \"kubernetes.io/projected/94f53f42-a5fc-45f9-b94c-4f12b63d8d75-kube-api-access-h7wrg\") pod \"keystone-5994598694-dhq5v\" (UID: \"94f53f42-a5fc-45f9-b94c-4f12b63d8d75\") " pod="openstack/keystone-5994598694-dhq5v"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.423988 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424060 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424131 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424172 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424203 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.424240 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.429187 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl"
Jan 27 11:37:51 crc
kubenswrapper[4775]: I0127 11:37:51.429809 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.433175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.438620 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.440197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.441277 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.465202 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"placement-6b9b59fc66-t6rbl\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.547052 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.643899 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.812044 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0276dc98-8972-465b-bf5a-e222c73eb8a0" path="/var/lib/kubelet/pods/0276dc98-8972-465b-bf5a-e222c73eb8a0/volumes" Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.901353 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5994598694-dhq5v"] Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.904260 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.906532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.934593 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerStarted","Data":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} Jan 27 11:37:51 crc kubenswrapper[4775]: I0127 11:37:51.968911 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=8.968888902 podStartE2EDuration="8.968888902s" podCreationTimestamp="2026-01-27 11:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:51.958095525 +0000 UTC m=+1051.099693312" watchObservedRunningTime="2026-01-27 11:37:51.968888902 +0000 UTC m=+1051.110486679" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.271854 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:37:52 crc kubenswrapper[4775]: W0127 11:37:52.282953 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod926c665f_b922_4372_85aa_bbe29399eaac.slice/crio-0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7 WatchSource:0}: Error finding container 0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7: Status 404 returned error can't find the container with id 0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7 Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" 
event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerStarted","Data":"0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.968627 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.970929 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerStarted","Data":"1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c"} Jan 27 11:37:52 crc kubenswrapper[4775]: I0127 11:37:52.979108 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerStarted","Data":"2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.005284 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b9b59fc66-t6rbl" podStartSLOduration=2.005263183 podStartE2EDuration="2.005263183s" podCreationTimestamp="2026-01-27 11:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:52.994878548 +0000 UTC m=+1052.136476325" watchObservedRunningTime="2026-01-27 11:37:53.005263183 +0000 UTC m=+1052.146860960" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.012629 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerStarted","Data":"41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.022811 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-2nfbz" podStartSLOduration=2.449535328 podStartE2EDuration="47.022793214s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.688422935 +0000 UTC m=+1006.830020712" lastFinishedPulling="2026-01-27 11:37:52.261680821 +0000 UTC m=+1051.403278598" observedRunningTime="2026-01-27 11:37:53.014775664 +0000 UTC m=+1052.156373431" watchObservedRunningTime="2026-01-27 11:37:53.022793214 +0000 UTC m=+1052.164390991" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5994598694-dhq5v" event={"ID":"94f53f42-a5fc-45f9-b94c-4f12b63d8d75","Type":"ContainerStarted","Data":"f4e267e8e6c46c7ea5135342bbd56e5f4d8c0dc885b0f451523f5713bcaf56fb"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029532 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5994598694-dhq5v" event={"ID":"94f53f42-a5fc-45f9-b94c-4f12b63d8d75","Type":"ContainerStarted","Data":"7bf24c9591bfe20f8b8f6d29ed33805940d457f0f72bd837d83bf7d002869247"} Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.029571 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.055752 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=9.055733077 podStartE2EDuration="9.055733077s" podCreationTimestamp="2026-01-27 11:37:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:53.039240534 +0000 UTC m=+1052.180838331" watchObservedRunningTime="2026-01-27 11:37:53.055733077 +0000 UTC m=+1052.197330854" Jan 27 11:37:53 crc kubenswrapper[4775]: I0127 11:37:53.070383 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-xbnrk" podStartSLOduration=3.110992017 podStartE2EDuration="47.070365787s" podCreationTimestamp="2026-01-27 11:37:06 +0000 UTC" firstStartedPulling="2026-01-27 11:37:07.383117263 +0000 UTC m=+1006.524715040" lastFinishedPulling="2026-01-27 11:37:51.342491033 +0000 UTC m=+1050.484088810" observedRunningTime="2026-01-27 11:37:53.063837799 +0000 UTC m=+1052.205435576" watchObservedRunningTime="2026-01-27 11:37:53.070365787 +0000 UTC m=+1052.211963554" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.353510 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.353597 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.397680 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.400345 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:37:54 crc kubenswrapper[4775]: I0127 11:37:54.429220 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5994598694-dhq5v" podStartSLOduration=3.429195592 podStartE2EDuration="3.429195592s" podCreationTimestamp="2026-01-27 11:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:37:53.086564082 +0000 UTC m=+1052.228161859" watchObservedRunningTime="2026-01-27 11:37:54.429195592 +0000 UTC m=+1053.570793369" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.045373 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.045410 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.182135 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.182509 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.221081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.221898 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.255589 4775 prober.go:107] "Probe 
failed" probeType="Startup" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:37:55 crc kubenswrapper[4775]: I0127 11:37:55.349039 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6546ffcc78-4zdnk" podUID="00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused" Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.058662 4775 generic.go:334] "Generic (PLEG): container finished" podID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerID="1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c" exitCode=0 Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.058773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerDied","Data":"1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c"} Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.060331 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:56 crc kubenswrapper[4775]: I0127 11:37:56.060363 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.029396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.070630 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:37:57 crc kubenswrapper[4775]: I0127 11:37:57.186877 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:37:58 crc kubenswrapper[4775]: I0127 11:37:58.086747 4775 generic.go:334] "Generic (PLEG): container finished" podID="2029cc7b-c115-4c17-8713-c6eed291e963" containerID="41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017" exitCode=0 Jan 27 11:37:58 crc kubenswrapper[4775]: I0127 11:37:58.086869 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerDied","Data":"41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017"} Jan 27 11:37:58 crc kubenswrapper[4775]: I0127 11:37:58.877126 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.764494 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.775267 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806678 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.806746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") pod \"0edaeaa2-aa90-484f-854c-db5dd181f61b\" (UID: \"0edaeaa2-aa90-484f-854c-db5dd181f61b\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.828303 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp" (OuterVolumeSpecName: "kube-api-access-ftqgp") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). InnerVolumeSpecName "kube-api-access-ftqgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.828487 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.863145 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0edaeaa2-aa90-484f-854c-db5dd181f61b" (UID: "0edaeaa2-aa90-484f-854c-db5dd181f61b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907781 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907818 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907896 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907927 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.907975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908092 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") pod \"2029cc7b-c115-4c17-8713-c6eed291e963\" (UID: \"2029cc7b-c115-4c17-8713-c6eed291e963\") " Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908231 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908501 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftqgp\" (UniqueName: \"kubernetes.io/projected/0edaeaa2-aa90-484f-854c-db5dd181f61b-kube-api-access-ftqgp\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908513 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908521 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2029cc7b-c115-4c17-8713-c6eed291e963-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.908551 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0edaeaa2-aa90-484f-854c-db5dd181f61b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.911920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts" (OuterVolumeSpecName: "scripts") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.913027 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.915176 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8" (OuterVolumeSpecName: "kube-api-access-h7xk8") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "kube-api-access-h7xk8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.932033 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:37:59 crc kubenswrapper[4775]: I0127 11:37:59.962656 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data" (OuterVolumeSpecName: "config-data") pod "2029cc7b-c115-4c17-8713-c6eed291e963" (UID: "2029cc7b-c115-4c17-8713-c6eed291e963"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010878 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010912 4775 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010923 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010931 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2029cc7b-c115-4c17-8713-c6eed291e963-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.010939 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7xk8\" (UniqueName: \"kubernetes.io/projected/2029cc7b-c115-4c17-8713-c6eed291e963-kube-api-access-h7xk8\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.059051 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-2nfbz" event={"ID":"0edaeaa2-aa90-484f-854c-db5dd181f61b","Type":"ContainerDied","Data":"9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd"} Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144119 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9170c8f0fe1b93f735c76c15f9a93fc8d92b886973d63e04084aa00a5cbc88dd" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.144167 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-2nfbz" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.158769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-xbnrk" event={"ID":"2029cc7b-c115-4c17-8713-c6eed291e963","Type":"ContainerDied","Data":"3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0"} Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.159139 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bc9015e48f89109be48fe8277a72545dd42d19ee96ca2b3cb7712694284f3b0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.159207 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-xbnrk" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567257 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: E0127 11:38:00.567865 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567877 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: E0127 11:38:00.567891 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.567897 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.568034 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" containerName="barbican-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.568049 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" containerName="cinder-db-sync" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.569304 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.572682 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-dtgzl" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.573092 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.573265 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.578507 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.608134 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623352 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623409 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " 
pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623577 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623596 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.623668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.663753 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.677023 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.677136 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724711 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724779 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724795 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod 
\"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724821 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724847 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724923 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724946 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724963 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.724993 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.730137 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.743485 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc 
kubenswrapper[4775]: I0127 11:38:00.751697 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.752591 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.752978 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.773262 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.773346 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.777523 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.782046 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.790241 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831103 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831222 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " 
pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831288 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831389 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831430 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831529 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831568 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.831610 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.832601 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833578 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.833826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.834101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.853267 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"dnsmasq-dns-9f5756c4f-s4q7z\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") " pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.900325 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.910629 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932797 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932895 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932952 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.932995 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933054 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933069 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933385 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.933416 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.938797 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0" 
Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.941062 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0"
Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.942099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0"
Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.954893 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0"
Jan 27 11:38:00 crc kubenswrapper[4775]: I0127 11:38:00.961322 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"cinder-api-0\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " pod="openstack/cinder-api-0"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.014533 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"]
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.016142 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.023698 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.027704 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-4b27z"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.029849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.087581 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"]
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.123625 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"]
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.123777 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137129 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137480 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137514 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137588 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.137660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.156633 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"]
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.219845 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.236416 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.240969 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241023 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241064 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241107 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241138 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241155 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241183 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " 
pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.241593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.244516 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.245894 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.255697 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.263832 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.271552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.272841 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.276780 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"barbican-worker-695f7dfd45-zbb58\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.279109 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.280983 4775 util.go:30] "No 
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.299629 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.304618 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"]
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345062 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345151 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345349 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345682 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345723 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.345773 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347819 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.347906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.349926 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.353887 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.365332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.368418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.377789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"barbican-keystone-listener-667698bbc6-zpl9x\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.378174 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450190 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450283 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450305 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450359 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450375 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd"
pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450482 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.450507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.451966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.452010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.452805 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.453123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.454226 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.454680 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.456623 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.457111 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.457174 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.471960 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.478241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"dnsmasq-dns-75bfc9b94f-2kvdd\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.480058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"barbican-api-7696d8466d-w52tt\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.589222 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:01 crc kubenswrapper[4775]: I0127 11:38:01.642658 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:02 crc kubenswrapper[4775]: I0127 11:38:02.451249 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:03 crc kubenswrapper[4775]: E0127 11:38:03.339322 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" Jan 27 11:38:03 crc kubenswrapper[4775]: I0127 11:38:03.796248 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.074862 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:04 crc kubenswrapper[4775]: W0127 11:38:04.078195 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48cf8210_a82a_4a2c_9c3f_b4f28cc3e4b2.slice/crio-03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4 WatchSource:0}: Error finding container 03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4: Status 404 returned error can't find the container with id 03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.088985 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.101750 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.111121 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.118510 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.131932 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.221394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"7725e0d31cab8fdd988ddc82ff5c6e00f8aac8edb67890b0869f5c2b5c515d21"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.223064 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"0d2d8798bfd1e1511045000c7dea13845346445c7084205b6f8006dd91903bbd"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.224234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerStarted","Data":"eb38c8a0c3a6789c718c68d92bcb1866aef03b97c45dbc45b7c10e4d35714637"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.226721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerStarted","Data":"98a47029353e8ac81c34e8a77e13a6ae144436ae57c8cc4cc8ecca40c93dad8a"} Jan 27 11:38:04 crc 
kubenswrapper[4775]: I0127 11:38:04.228760 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" containerID="cri-o://089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228858 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" containerID="cri-o://443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228896 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" containerID="cri-o://5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5" gracePeriod=30 Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.228763 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerStarted","Data":"443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.229134 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.233087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"f81b70a5029c4a6796c226030165299ba18cc9b31e7b37fbe5cd06acf314b976"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.234273 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"2c4d9c2cc89971c922a946b8d57e98b6524b909e64d2d0876060a57a1644a6f7"} Jan 27 11:38:04 crc kubenswrapper[4775]: I0127 11:38:04.235572 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252041 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" containerID="443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2" exitCode=0 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252647 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" containerID="5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5" exitCode=2 Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2"} Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.252808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5"} Jan 27 11:38:05 crc kubenswrapper[4775]: 
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.255092 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0"}
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.256966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2"}
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.256995 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerStarted","Data":"e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0"}
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.257055 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.257264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7696d8466d-w52tt"
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.264721 4775 generic.go:334] "Generic (PLEG): container finished" podID="53358000-8708-4b14-9f75-49ae61de192c" containerID="e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3" exitCode=0
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.264962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerDied","Data":"e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3"}
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.266986 4775 generic.go:334] "Generic (PLEG): container finished" podID="91668934-529e-4df9-b41f-8cd54e5920ea" containerID="88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add" exitCode=0
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.267014 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add"}
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.302851 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7696d8466d-w52tt" podStartSLOduration=4.302832105 podStartE2EDuration="4.302832105s" podCreationTimestamp="2026-01-27 11:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:05.278968711 +0000 UTC m=+1064.420566498" watchObservedRunningTime="2026-01-27 11:38:05.302832105 +0000 UTC m=+1064.444429882"
Jan 27 11:38:05 crc kubenswrapper[4775]: I0127 11:38:05.347525 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6546ffcc78-4zdnk" podUID="00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.145:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.145:8443: connect: connection refused"
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.106795 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z"
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151506 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151811 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151869 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.151972 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.152012 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") pod \"53358000-8708-4b14-9f75-49ae61de192c\" (UID: \"53358000-8708-4b14-9f75-49ae61de192c\") "
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.177413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn" (OuterVolumeSpecName: "kube-api-access-cnrkn") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "kube-api-access-cnrkn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.180377 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.206375 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config" (OuterVolumeSpecName: "config") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.209979 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.227714 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.238637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "53358000-8708-4b14-9f75-49ae61de192c" (UID: "53358000-8708-4b14-9f75-49ae61de192c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257367 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257701 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnrkn\" (UniqueName: \"kubernetes.io/projected/53358000-8708-4b14-9f75-49ae61de192c-kube-api-access-cnrkn\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257713 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257722 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257731 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.257739 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/53358000-8708-4b14-9f75-49ae61de192c-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.302369 4775 generic.go:334] "Generic (PLEG): container finished" podID="8a82d041-4b07-491a-8af6-232e67a23299" 
containerID="089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e" exitCode=0 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.302438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.312918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerStarted","Data":"7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313047 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log" containerID="cri-o://d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0" gracePeriod=30 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313084 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api" containerID="cri-o://7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1" gracePeriod=30 Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.313190 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.335600 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.336400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9f5756c4f-s4q7z" event={"ID":"53358000-8708-4b14-9f75-49ae61de192c","Type":"ContainerDied","Data":"eb38c8a0c3a6789c718c68d92bcb1866aef03b97c45dbc45b7c10e4d35714637"} Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.336471 4775 scope.go:117] "RemoveContainer" containerID="e80601e59ac96d3876f9dd39d7ba994c4c403ba5d862cc1049da791ec0bb87d3" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.358323 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.35828949 podStartE2EDuration="6.35828949s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:06.337672834 +0000 UTC m=+1065.479270611" watchObservedRunningTime="2026-01-27 11:38:06.35828949 +0000 UTC m=+1065.499887267" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.462870 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.504344 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:38:06 crc kubenswrapper[4775]: E0127 11:38:06.504789 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53358000-8708-4b14-9f75-49ae61de192c" containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.504820 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="53358000-8708-4b14-9f75-49ae61de192c" containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.505034 4775 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="53358000-8708-4b14-9f75-49ae61de192c" containerName="init" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.506020 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9f5756c4f-s4q7z"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.506118 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.508724 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.508993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.528650 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581567 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581616 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581636 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581671 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581702 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581743 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.581765 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684504 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684710 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684730 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.684763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.688059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.695951 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.695959 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.697131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.698409 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.703892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.712374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"barbican-api-8bc6678d8-674l9\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:06 crc kubenswrapper[4775]: I0127 11:38:06.849349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345041 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerID="7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1" exitCode=0 Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345078 4775 generic.go:334] "Generic (PLEG): container finished" podID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerID="d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0" exitCode=143 Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345098 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1"} Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.345124 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0"} Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.754893 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53358000-8708-4b14-9f75-49ae61de192c" path="/var/lib/kubelet/pods/53358000-8708-4b14-9f75-49ae61de192c/volumes" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.793230 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.828642 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905316 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905531 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.905563 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906414 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906575 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc 
kubenswrapper[4775]: I0127 11:38:07.906683 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.906741 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") pod \"8a82d041-4b07-491a-8af6-232e67a23299\" (UID: \"8a82d041-4b07-491a-8af6-232e67a23299\") " Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.907017 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.909151 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.909178 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.912887 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55" (OuterVolumeSpecName: "kube-api-access-z5n55") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "kube-api-access-z5n55". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.919067 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts" (OuterVolumeSpecName: "scripts") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:07 crc kubenswrapper[4775]: I0127 11:38:07.963314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015017 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015260 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a82d041-4b07-491a-8af6-232e67a23299-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015340 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5n55\" (UniqueName: \"kubernetes.io/projected/8a82d041-4b07-491a-8af6-232e67a23299-kube-api-access-z5n55\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.015416 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.021677 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.062684 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data" (OuterVolumeSpecName: "config-data") pod "8a82d041-4b07-491a-8af6-232e67a23299" (UID: "8a82d041-4b07-491a-8af6-232e67a23299"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.120250 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.120288 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a82d041-4b07-491a-8af6-232e67a23299-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.154441 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.154680 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" containerID="cri-o://8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d" gracePeriod=30 Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.155288 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" containerID="cri-o://8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe" gracePeriod=30 Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.185924 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186409 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186429 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186482 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186493 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.186512 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186521 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186770 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="ceilometer-notification-agent" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186805 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="sg-core" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.186818 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a82d041-4b07-491a-8af6-232e67a23299" containerName="proxy-httpd" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.188079 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.197846 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.262699 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-55b847b569-ccplz" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": read tcp 10.217.0.2:45210->10.217.0.152:9696: read: connection reset by peer" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.323598 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.323974 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324104 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324286 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324369 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324458 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.324553 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.368792 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerStarted","Data":"a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615"} Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.368864 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a82d041-4b07-491a-8af6-232e67a23299","Type":"ContainerDied","Data":"a1da85b3df4788f571e86de3391158e11cf2502b74702f3be38ea8d5b9dea0f2"} Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389832 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.389851 4775 scope.go:117] "RemoveContainer" containerID="443a1b23fe2193180d98684045f0c5460c5490556325375373577c0a10fc76b2" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.410019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db"} Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.420830 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.423147 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be","Type":"ContainerDied","Data":"2c4d9c2cc89971c922a946b8d57e98b6524b909e64d2d0876060a57a1644a6f7"} Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427523 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.427590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.428209 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.436111 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437268 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437603 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.437753 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" podStartSLOduration=7.437734375 podStartE2EDuration="7.437734375s" podCreationTimestamp="2026-01-27 11:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:08.389217824 +0000 UTC m=+1067.530815621" watchObservedRunningTime="2026-01-27 11:38:08.437734375 +0000 UTC m=+1067.579332152" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.443524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.444034 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.452953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.464241 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"neutron-6f57cbf767-xvk7k\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.469637 4775 scope.go:117] "RemoveContainer" containerID="5244001eb3a13f0c4abc67276bce40ec6973ea3761d765924e030142c43bc5b5" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.506576 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.514844 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.522596 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.523098 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523118 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api" Jan 27 11:38:08 crc kubenswrapper[4775]: E0127 11:38:08.523136 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523142 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523369 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api-log" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.523408 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" containerName="cinder-api" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530337 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530437 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530506 4775 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530561 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.530730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") pod \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\" (UID: \"e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be\") " Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.532591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.535871 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs" (OuterVolumeSpecName: "logs") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.537661 4775 scope.go:117] "RemoveContainer" containerID="089d2bc126411c7bc6665d485ed89d030e83e1513259c5c8f16328e6a4bd213e" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.543544 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.549075 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts" (OuterVolumeSpecName: "scripts") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.550357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z" (OuterVolumeSpecName: "kube-api-access-xpt4z") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "kube-api-access-xpt4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.563794 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.563975 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.567383 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.568929 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.569111 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.589733 4775 scope.go:117] "RemoveContainer" containerID="7095637e9396e1da41094b8f13a5d4acfb0cb246f0bca43d3c480928763afee1" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633764 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633807 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633837 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633853 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633901 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633918 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.633942 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634012 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpt4z\" (UniqueName: \"kubernetes.io/projected/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-kube-api-access-xpt4z\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634023 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634031 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634039 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634047 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.634845 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.645848 4775 scope.go:117] "RemoveContainer" containerID="d95672c7202b1212bf2392f51b8192ddd3370d76dac1053d37c2f0bc490e15b0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.647762 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data" (OuterVolumeSpecName: "config-data") pod "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" (UID: "e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.720704 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735371 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735422 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735596 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735620 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735701 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735714 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.735974 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.737396 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.741851 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.742156 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.744072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.746690 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.755739 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"ceilometer-0\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " pod="openstack/ceilometer-0" Jan 27 11:38:08 crc kubenswrapper[4775]: I0127 11:38:08.897166 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.263530 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.439234 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442457 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.442468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerStarted","Data":"7741a0906f599fd7687720fdb78021f6c23a07fdd0533bbdc83dc1e97a16a161"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.443791 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.443818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.449016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.449075 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerStarted","Data":"0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.456764 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerStarted","Data":"42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.467524 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477322 4775 generic.go:334] "Generic (PLEG): container finished" podID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerID="2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c" exitCode=137 Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477360 4775 generic.go:334] "Generic (PLEG): container finished" podID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerID="5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49" exitCode=137 Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.477469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.478396 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"53a128ffc6e310fa157dfd37a105cff396b2195c605357ef2976ef48f28caaf9"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.481875 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8bc6678d8-674l9" podStartSLOduration=3.481859139 podStartE2EDuration="3.481859139s" podCreationTimestamp="2026-01-27 11:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:09.467930387 +0000 UTC m=+1068.609528164" watchObservedRunningTime="2026-01-27 11:38:09.481859139 +0000 UTC m=+1068.623456916" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.502012 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerID="8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe" exitCode=0 Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.502107 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.506895 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556416 4775 generic.go:334] "Generic (PLEG): container finished" podID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerID="27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" exitCode=137 Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556747 4775 generic.go:334] "Generic (PLEG): container finished" podID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerID="5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" exitCode=137 Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556789 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" 
event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.556824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde"} Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.573123 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-695f7dfd45-zbb58" podStartSLOduration=5.564138901 podStartE2EDuration="9.573103191s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:03.831957919 +0000 UTC m=+1062.973555696" lastFinishedPulling="2026-01-27 11:38:07.840922209 +0000 UTC m=+1066.982519986" observedRunningTime="2026-01-27 11:38:09.553612166 +0000 UTC m=+1068.695209943" watchObservedRunningTime="2026-01-27 11:38:09.573103191 +0000 UTC m=+1068.714700968" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.591596 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podStartSLOduration=5.8064087650000005 podStartE2EDuration="9.591572387s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:04.081677137 +0000 UTC m=+1063.223274914" lastFinishedPulling="2026-01-27 11:38:07.866840759 +0000 UTC m=+1067.008438536" observedRunningTime="2026-01-27 11:38:09.526431501 +0000 UTC m=+1068.668029278" watchObservedRunningTime="2026-01-27 11:38:09.591572387 +0000 UTC m=+1068.733170164" Jan 27 11:38:09 crc kubenswrapper[4775]: W0127 11:38:09.593311 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf43a36d6_24df_43c5_9d20_aaa35c11f855.slice/crio-28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c WatchSource:0}: Error finding container 28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c: Status 404 returned error can't find the container with id 28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.626477 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.636880 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.655078 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.656810 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.662857 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.663683 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.663952 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.664229 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.757910 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a82d041-4b07-491a-8af6-232e67a23299" path="/var/lib/kubelet/pods/8a82d041-4b07-491a-8af6-232e67a23299/volumes" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.759140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be" path="/var/lib/kubelet/pods/e8ddf5ca-9d7d-4e24-9392-103ba0c6d4be/volumes" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.759945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760078 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760186 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760210 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760237 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760293 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760319 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.760362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863678 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863890 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " 
pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863978 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.863999 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.865252 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.866211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.868316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.870253 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.870602 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.871126 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.875615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.878823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.884391 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"cinder-api-0\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " pod="openstack/cinder-api-0" Jan 27 11:38:09 crc kubenswrapper[4775]: I0127 11:38:09.943960 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.030592 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077442 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.077586 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") pod \"c73cda8b-d244-4ad1-8f54-f5680565327d\" (UID: \"c73cda8b-d244-4ad1-8f54-f5680565327d\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.078295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs" (OuterVolumeSpecName: "logs") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.082919 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.085438 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4" (OuterVolumeSpecName: "kube-api-access-ngfh4") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "kube-api-access-ngfh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.126060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data" (OuterVolumeSpecName: "config-data") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.136728 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts" (OuterVolumeSpecName: "scripts") pod "c73cda8b-d244-4ad1-8f54-f5680565327d" (UID: "c73cda8b-d244-4ad1-8f54-f5680565327d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179185 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179213 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c73cda8b-d244-4ad1-8f54-f5680565327d-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179222 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c73cda8b-d244-4ad1-8f54-f5680565327d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179231 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngfh4\" (UniqueName: \"kubernetes.io/projected/c73cda8b-d244-4ad1-8f54-f5680565327d-kube-api-access-ngfh4\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.179261 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c73cda8b-d244-4ad1-8f54-f5680565327d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.281186 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.383794 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384171 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384226 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384280 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.384346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") pod \"29a2a294-6d96-4169-9be8-7109251bf8b1\" (UID: \"29a2a294-6d96-4169-9be8-7109251bf8b1\") " Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.386766 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs" (OuterVolumeSpecName: "logs") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.395636 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.395737 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7" (OuterVolumeSpecName: "kube-api-access-krmx7") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "kube-api-access-krmx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.465983 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data" (OuterVolumeSpecName: "config-data") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.470484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts" (OuterVolumeSpecName: "scripts") pod "29a2a294-6d96-4169-9be8-7109251bf8b1" (UID: "29a2a294-6d96-4169-9be8-7109251bf8b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485932 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485954 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/29a2a294-6d96-4169-9be8-7109251bf8b1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485965 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krmx7\" (UniqueName: \"kubernetes.io/projected/29a2a294-6d96-4169-9be8-7109251bf8b1-kube-api-access-krmx7\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485973 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/29a2a294-6d96-4169-9be8-7109251bf8b1-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.485981 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29a2a294-6d96-4169-9be8-7109251bf8b1-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.581634 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.592513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58cf66fb49-4l4kc" event={"ID":"c73cda8b-d244-4ad1-8f54-f5680565327d","Type":"ContainerDied","Data":"c249bdd94a125524e988795b71a7762c676a0ef2577e0640b92316f827a03d2f"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.592938 4775 scope.go:117] "RemoveContainer" containerID="27965c735360621fc3e3960fb4bac6c83e5f074ce46fbbf9d72eadc3af3a359f" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.593175 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-58cf66fb49-4l4kc" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.630992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.631043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639889 4775 generic.go:334] "Generic (PLEG): container finished" podID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerID="69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99" exitCode=137 Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639932 4775 generic.go:334] "Generic (PLEG): container finished" podID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerID="d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2" exitCode=137 Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639937 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.639987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.647249 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerStarted","Data":"91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.654271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c78fd876f-8p4lr" event={"ID":"29a2a294-6d96-4169-9be8-7109251bf8b1","Type":"ContainerDied","Data":"e85e8e0f44ac4f6cbdc0a4bbf06db8528c1a4b4037fff448eea7f2f74eae3616"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.654565 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c78fd876f-8p4lr" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.660011 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.682191 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58cf66fb49-4l4kc"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.684704 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.515606653 podStartE2EDuration="10.684692715s" podCreationTimestamp="2026-01-27 11:38:00 +0000 UTC" firstStartedPulling="2026-01-27 11:38:04.082072257 +0000 UTC m=+1063.223670034" lastFinishedPulling="2026-01-27 11:38:08.251158319 +0000 UTC m=+1067.392756096" observedRunningTime="2026-01-27 11:38:10.667947376 +0000 UTC m=+1069.809545153" watchObservedRunningTime="2026-01-27 11:38:10.684692715 +0000 UTC m=+1069.826290492" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.688418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.688520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerStarted","Data":"0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23"} Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.690051 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.726184 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.738717 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c78fd876f-8p4lr"] Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.746789 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f57cbf767-xvk7k" podStartSLOduration=2.746769007 podStartE2EDuration="2.746769007s" podCreationTimestamp="2026-01-27 11:38:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:10.718531803 +0000 UTC m=+1069.860129600" watchObservedRunningTime="2026-01-27 11:38:10.746769007 +0000 UTC m=+1069.888366804" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.881628 4775 scope.go:117] "RemoveContainer" containerID="5a7b8b818080f5556f5d65d07c2be8e6283d041522c2dd938c726bf295f59bde" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.911833 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.919693 4775 scope.go:117] "RemoveContainer" containerID="2ea013924b4f290fa084967e63882264b54bdf3e3f2ae5d4a85e13ca12cc197c" Jan 27 11:38:10 crc kubenswrapper[4775]: I0127 11:38:10.960022 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102146 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102189 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102250 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.102297 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") pod \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\" (UID: \"dd14daeb-9a49-4720-9c96-b6caf1257d5a\") " Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.103337 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs" (OuterVolumeSpecName: "logs") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.108510 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs" (OuterVolumeSpecName: "kube-api-access-jqxbs") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "kube-api-access-jqxbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.110562 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.135697 4775 scope.go:117] "RemoveContainer" containerID="5506d184fe477b46386663b63596691c1993b133b8a155542ea5cad65532df49" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.142822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data" (OuterVolumeSpecName: "config-data") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.149280 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts" (OuterVolumeSpecName: "scripts") pod "dd14daeb-9a49-4720-9c96-b6caf1257d5a" (UID: "dd14daeb-9a49-4720-9c96-b6caf1257d5a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204315 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd14daeb-9a49-4720-9c96-b6caf1257d5a-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204349 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/dd14daeb-9a49-4720-9c96-b6caf1257d5a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204358 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204367 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dd14daeb-9a49-4720-9c96-b6caf1257d5a-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.204376 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxbs\" (UniqueName: \"kubernetes.io/projected/dd14daeb-9a49-4720-9c96-b6caf1257d5a-kube-api-access-jqxbs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.703617 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361"} Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.708062 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"} Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.708137 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"33dee4dc93223d68ed0c9843e6651623dd7c73f98dd4eee5700b9bc73cb6734c"} Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716739 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5f6cd994f7-2jm86" 
event={"ID":"dd14daeb-9a49-4720-9c96-b6caf1257d5a","Type":"ContainerDied","Data":"96c45e8e9930bf07afed2f11987b0afd9b083256c7c2af2e8c36913249d87fa8"} Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716778 4775 scope.go:117] "RemoveContainer" containerID="69acbe0e1dbc2111ef595f05096451e17cc913c47831643c290c11171c0a8d99" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.716795 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5f6cd994f7-2jm86" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.775832 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" path="/var/lib/kubelet/pods/29a2a294-6d96-4169-9be8-7109251bf8b1/volumes" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.776482 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" path="/var/lib/kubelet/pods/c73cda8b-d244-4ad1-8f54-f5680565327d/volumes" Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.887635 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:38:11 crc kubenswrapper[4775]: I0127 11:38:11.895592 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5f6cd994f7-2jm86"] Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.085773 4775 scope.go:117] "RemoveContainer" containerID="d741f03877ce7a29e41d06ab00c0d5e162e792f15a5fb3cb77d4cd2ce96127c2" Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.740469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerStarted","Data":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"} Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.741129 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.744723 4775 generic.go:334] "Generic (PLEG): container finished" podID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerID="8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d" exitCode=0 Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.744798 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d"} Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.747237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a"} Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.770954 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.770934167 podStartE2EDuration="3.770934167s" podCreationTimestamp="2026-01-27 11:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:12.764535131 +0000 UTC m=+1071.906132908" watchObservedRunningTime="2026-01-27 11:38:12.770934167 +0000 UTC m=+1071.912531934" Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.824864 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962534 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962610 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962723 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962868 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.962906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") pod \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\" (UID: \"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c\") " Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.968305 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:12 crc kubenswrapper[4775]: I0127 11:38:12.974141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv" (OuterVolumeSpecName: "kube-api-access-lvbbv") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "kube-api-access-lvbbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.016974 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.019170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config" (OuterVolumeSpecName: "config") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.028029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.035872 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.043183 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" (UID: "a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064937 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064974 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064984 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbbv\" (UniqueName: \"kubernetes.io/projected/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-kube-api-access-lvbbv\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.064997 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065006 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065015 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.065023 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.208319 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.365006 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.758948 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" path="/var/lib/kubelet/pods/dd14daeb-9a49-4720-9c96-b6caf1257d5a/volumes" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.761560 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-55b847b569-ccplz" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.762299 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-55b847b569-ccplz" event={"ID":"a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c","Type":"ContainerDied","Data":"129e86fff0154f3e4de3082e715fe1284c270556711420ae01c9066fffafb3c8"} Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.762358 4775 scope.go:117] "RemoveContainer" containerID="8e118e849fbf875dde2f05c2e98a8511d2d701c095eaa63e50b73abe199d91fe" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.799986 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.810494 4775 scope.go:117] "RemoveContainer" containerID="8b2a4356eb5f8df33ebc58ad0b94e8bc53209136a336f43ded79b5472757c90d" Jan 27 11:38:13 crc kubenswrapper[4775]: I0127 11:38:13.817263 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-55b847b569-ccplz"] Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.771643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerStarted","Data":"757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30"} Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.772948 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:38:14 crc kubenswrapper[4775]: I0127 11:38:14.822085 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.921356365 podStartE2EDuration="6.822058536s" podCreationTimestamp="2026-01-27 11:38:08 +0000 UTC" firstStartedPulling="2026-01-27 11:38:09.635157143 +0000 UTC m=+1068.776754920" lastFinishedPulling="2026-01-27 11:38:13.535859314 +0000 UTC m=+1072.677457091" observedRunningTime="2026-01-27 11:38:14.806418527 +0000 UTC m=+1073.948016324" watchObservedRunningTime="2026-01-27 11:38:14.822058536 +0000 UTC m=+1073.963656313" Jan 27 11:38:15 crc kubenswrapper[4775]: I0127 11:38:15.758158 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" path="/var/lib/kubelet/pods/a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c/volumes" Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.176497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.223199 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.590625 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.638389 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.638642 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" containerID="cri-o://a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a" gracePeriod=10 Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.805437 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerID="a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a" exitCode=0 Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.805603 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a"} Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.806294 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" containerID="cri-o://93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" gracePeriod=30 Jan 27 11:38:16 crc kubenswrapper[4775]: I0127 11:38:16.806705 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" containerID="cri-o://91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" gracePeriod=30 Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.196039 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352510 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352566 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352826 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.352867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") pod \"558b9501-01cb-43ac-aed0-f0cbc868ce59\" (UID: \"558b9501-01cb-43ac-aed0-f0cbc868ce59\") " Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.387623 4775 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5" (OuterVolumeSpecName: "kube-api-access-vwjs5") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "kube-api-access-vwjs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.411182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.430906 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.435840 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460823 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwjs5\" (UniqueName: \"kubernetes.io/projected/558b9501-01cb-43ac-aed0-f0cbc868ce59-kube-api-access-vwjs5\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460856 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460865 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.460874 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.462933 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.472113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config" (OuterVolumeSpecName: "config") pod "558b9501-01cb-43ac-aed0-f0cbc868ce59" (UID: "558b9501-01cb-43ac-aed0-f0cbc868ce59"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.568669 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.568717 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558b9501-01cb-43ac-aed0-f0cbc868ce59-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.803796 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.806120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.816619 4775 generic.go:334] "Generic (PLEG): container finished" podID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerID="91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" exitCode=0 Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.816726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131"} Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" event={"ID":"558b9501-01cb-43ac-aed0-f0cbc868ce59","Type":"ContainerDied","Data":"f613d08fcd685ed44899c259a171ad733b3147458ae9f365bbc1e423524fcf00"} Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819789 4775 scope.go:117] "RemoveContainer" containerID="a245340eb78d137ed3cb9c7df3352fab2464ec2b62b40355e4e4eb0fc55e898a" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.819963 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b9c8b59c-7jpkg" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.848251 4775 scope.go:117] "RemoveContainer" containerID="93626448b8ab20fd608cb51c7a09b76b9375b10a91e3ff2ab81efb1aa8fdb168" Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.881032 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:38:17 crc kubenswrapper[4775]: I0127 11:38:17.896800 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b9c8b59c-7jpkg"] Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.416960 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.459584 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.521411 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.530828 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" containerID="cri-o://e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" gracePeriod=30 Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.531410 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" containerID="cri-o://f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" gracePeriod=30 Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.828469 4775 generic.go:334] "Generic (PLEG): container finished" podID="837199be-1d46-4982-93ee-3f28a585d1d0" containerID="e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" exitCode=143 Jan 27 11:38:18 crc kubenswrapper[4775]: I0127 11:38:18.828661 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0"} Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.634033 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.758417 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" path="/var/lib/kubelet/pods/558b9501-01cb-43ac-aed0-f0cbc868ce59/volumes" Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.890134 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6546ffcc78-4zdnk" Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.966845 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:38:19 crc kubenswrapper[4775]: I0127 11:38:19.967122 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" containerID="cri-o://b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" gracePeriod=30 Jan 27 11:38:19 crc 
kubenswrapper[4775]: I0127 11:38:19.967236 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" containerID="cri-o://0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" gracePeriod=30 Jan 27 11:38:20 crc kubenswrapper[4775]: I0127 11:38:20.866609 4775 generic.go:334] "Generic (PLEG): container finished" podID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerID="93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" exitCode=0 Jan 27 11:38:20 crc kubenswrapper[4775]: I0127 11:38:20.866658 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8"} Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.203591 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.346633 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.346994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347051 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347098 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.347213 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") pod \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\" (UID: \"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2\") " Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.348264 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.353465 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts" (OuterVolumeSpecName: "scripts") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.353473 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5" (OuterVolumeSpecName: "kube-api-access-xvtx5") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "kube-api-access-xvtx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.354170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.411531 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449249 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449283 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449293 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvtx5\" (UniqueName: \"kubernetes.io/projected/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-kube-api-access-xvtx5\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449304 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.449313 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.468236 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data" (OuterVolumeSpecName: "config-data") pod "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" (UID: "48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.550816 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.700301 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:48020->10.217.0.163:9311: read: connection reset by peer" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.702414 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7696d8466d-w52tt" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:48028->10.217.0.163:9311: read: connection reset by peer" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.869197 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.917954 4775 generic.go:334] "Generic (PLEG): container finished" podID="837199be-1d46-4982-93ee-3f28a585d1d0" containerID="f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" exitCode=0 Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.918150 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2"} Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.922962 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2","Type":"ContainerDied","Data":"03fcbf6ca88140ee0c7c54aff2f27534dbded1c1f0d7ef78fa4f4153e2db46f4"} Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.922998 4775 scope.go:117] "RemoveContainer" containerID="91da05472e3595a00e190b2bcb487215369914030746cc03cf5ca234fe185131" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.923110 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.963598 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.970595 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.995569 4775 scope.go:117] "RemoveContainer" containerID="93c0e1e738416356c2758621400d93df83887d8dd15b0d587e3b64d7e4898cf8" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.995675 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996011 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996025 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996040 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996049 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996064 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996085 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996092 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996105 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996112 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996129 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996136 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996146 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996153 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996165 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996173 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996183 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996192 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996210 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996219 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996228 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996234 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc kubenswrapper[4775]: E0127 11:38:21.996248 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="init" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996257 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="init" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996484 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996495 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996512 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996520 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="558b9501-01cb-43ac-aed0-f0cbc868ce59" containerName="dnsmasq-dns" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996531 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="probe" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996546 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c73cda8b-d244-4ad1-8f54-f5680565327d" containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996554 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29a2a294-6d96-4169-9be8-7109251bf8b1" containerName="horizon-log" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996562 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" containerName="cinder-scheduler" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996573 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd14daeb-9a49-4720-9c96-b6caf1257d5a" 
containerName="horizon" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996583 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-api" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.996591 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ee3c72-3eb3-48bc-8d4b-c00c9f36296c" containerName="neutron-httpd" Jan 27 11:38:21 crc kubenswrapper[4775]: I0127 11:38:21.997736 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.001919 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.031215 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163523 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163602 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163662 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163699 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.163713 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.188357 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.265473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.265985 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266167 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266513 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.266538 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.267349 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.272883 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.274028 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.274058 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.275533 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.285588 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"cinder-scheduler-0\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.317549 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367409 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367492 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367521 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367601 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.367754 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") pod \"837199be-1d46-4982-93ee-3f28a585d1d0\" (UID: \"837199be-1d46-4982-93ee-3f28a585d1d0\") " Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.369283 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs" (OuterVolumeSpecName: "logs") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.374274 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t" (OuterVolumeSpecName: "kube-api-access-xjd8t") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "kube-api-access-xjd8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.376500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.397421 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.438888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data" (OuterVolumeSpecName: "config-data") pod "837199be-1d46-4982-93ee-3f28a585d1d0" (UID: "837199be-1d46-4982-93ee-3f28a585d1d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470345 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/837199be-1d46-4982-93ee-3f28a585d1d0-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470378 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjd8t\" (UniqueName: \"kubernetes.io/projected/837199be-1d46-4982-93ee-3f28a585d1d0-kube-api-access-xjd8t\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470406 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470416 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.470426 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/837199be-1d46-4982-93ee-3f28a585d1d0-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.780006 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:22 crc kubenswrapper[4775]: W0127 11:38:22.784505 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7287c167_2d78_4766_b072_0762f4c4d504.slice/crio-bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398 WatchSource:0}: Error finding container bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398: Status 404 returned error can't find the container with id bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398 Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.934678 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398"} Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936129 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7696d8466d-w52tt" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936129 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7696d8466d-w52tt" event={"ID":"837199be-1d46-4982-93ee-3f28a585d1d0","Type":"ContainerDied","Data":"0d2d8798bfd1e1511045000c7dea13845346445c7084205b6f8006dd91903bbd"} Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.936190 4775 scope.go:117] "RemoveContainer" containerID="f6b1261e70bbd30706bfbb925d2178443a048afc57cb4157e9e1e03777faecb2" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.972186 4775 scope.go:117] "RemoveContainer" containerID="e8609affd7b82a5a8a2b23b648cb0dca487cea8d7f1754f41f4cbf90181492f0" Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.974168 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:22 crc kubenswrapper[4775]: I0127 11:38:22.981004 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7696d8466d-w52tt"] Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.226284 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.380661 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5994598694-dhq5v" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.758174 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2" path="/var/lib/kubelet/pods/48cf8210-a82a-4a2c-9c3f-b4f28cc3e4b2/volumes" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.759397 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" path="/var/lib/kubelet/pods/837199be-1d46-4982-93ee-3f28a585d1d0/volumes" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792214 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:23 crc kubenswrapper[4775]: E0127 11:38:23.792645 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792658 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: E0127 11:38:23.792678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792888 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.792908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="837199be-1d46-4982-93ee-3f28a585d1d0" containerName="barbican-api-log" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.793575 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.796349 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.796650 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-j8z7v" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.798065 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.805015 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901875 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.901978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.953498 4775 generic.go:334] "Generic (PLEG): container finished" podID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerID="0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" exitCode=0 Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.953558 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d"} Jan 27 11:38:23 crc kubenswrapper[4775]: I0127 11:38:23.959856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc 
kubenswrapper[4775]: I0127 11:38:24.003308 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003356 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.003434 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.004347 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.007823 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.007936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.021711 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"openstackclient\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.049967 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.050683 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.060701 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.146385 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.147560 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.156679 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.191159 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:38:24 crc kubenswrapper[4775]: E0127 11:38:24.196871 4775 log.go:32] "RunPodSandbox from runtime service failed" err=< Jan 27 11:38:24 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_66dbc089-aa1e-46ef-a8a8-c3fdb1f590af_0(539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634" Netns:"/var/run/netns/ae8e53bd-aec2-44a5-9f5b-93c2c01aba92" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634;K8S_POD_UID=66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af]: expected pod UID "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" but got "db40a4a8-ce91-40a6-8b63-ccc17ed327da" from Kube API Jan 27 11:38:24 crc kubenswrapper[4775]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 27 11:38:24 crc kubenswrapper[4775]: > Jan 27 11:38:24 crc kubenswrapper[4775]: E0127 11:38:24.196938 4775 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Jan 27 11:38:24 crc kubenswrapper[4775]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_66dbc089-aa1e-46ef-a8a8-c3fdb1f590af_0(539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634" Netns:"/var/run/netns/ae8e53bd-aec2-44a5-9f5b-93c2c01aba92" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=539589a09340e570e02aa651aa05cce203dbde3391c47662de0cad585446f634;K8S_POD_UID=66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af]: expected pod UID "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" but got "db40a4a8-ce91-40a6-8b63-ccc17ed327da" from Kube API Jan 27 11:38:24 crc kubenswrapper[4775]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Jan 27 11:38:24 crc kubenswrapper[4775]: > pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308416 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308834 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308889 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.308917 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411024 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411168 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.411195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 
11:38:24.411979 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.415930 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-openstack-config-secret\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.418053 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db40a4a8-ce91-40a6-8b63-ccc17ed327da-combined-ca-bundle\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.427612 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgxv7\" (UniqueName: \"kubernetes.io/projected/db40a4a8-ce91-40a6-8b63-ccc17ed327da-kube-api-access-qgxv7\") pod \"openstackclient\" (UID: \"db40a4a8-ce91-40a6-8b63-ccc17ed327da\") " pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.472276 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.957542 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.974529 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"db40a4a8-ce91-40a6-8b63-ccc17ed327da","Type":"ContainerStarted","Data":"4866618d875b6b078a88700414f5169eaaf32dbde4a7d1b35c3fd383e5744baf"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.976932 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.977545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerStarted","Data":"42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7"} Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.980123 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" podUID="db40a4a8-ce91-40a6-8b63-ccc17ed327da" Jan 27 11:38:24 crc kubenswrapper[4775]: I0127 11:38:24.999269 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.99925412 podStartE2EDuration="3.99925412s" podCreationTimestamp="2026-01-27 11:38:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:24.998300804 +0000 UTC m=+1084.139898581" watchObservedRunningTime="2026-01-27 11:38:24.99925412 +0000 UTC m=+1084.140851897" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.012368 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.131920 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.131965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.132096 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.132150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") pod \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\" (UID: \"66dbc089-aa1e-46ef-a8a8-c3fdb1f590af\") " Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.134224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.137114 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.144100 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk" (OuterVolumeSpecName: "kube-api-access-bqdqk") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "kube-api-access-bqdqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.144109 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" (UID: "66dbc089-aa1e-46ef-a8a8-c3fdb1f590af"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233817 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233846 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdqk\" (UniqueName: \"kubernetes.io/projected/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-kube-api-access-bqdqk\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233858 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.233866 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.254550 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.754988 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" path="/var/lib/kubelet/pods/66dbc089-aa1e-46ef-a8a8-c3fdb1f590af/volumes" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.984209 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 27 11:38:25 crc kubenswrapper[4775]: I0127 11:38:25.990521 4775 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="66dbc089-aa1e-46ef-a8a8-c3fdb1f590af" podUID="db40a4a8-ce91-40a6-8b63-ccc17ed327da" Jan 27 11:38:27 crc kubenswrapper[4775]: I0127 11:38:27.318364 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127247 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127556 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent" containerID="cri-o://c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127695 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" containerID="cri-o://757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127772 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core" containerID="cri-o://004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.127823 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent" containerID="cri-o://9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361" gracePeriod=30 Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.140317 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.166:3000/\": EOF" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.487735 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.489476 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.494379 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.496605 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.497037 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.512050 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598359 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598426 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598496 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598673 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.598992 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " 
pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.599016 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700591 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700635 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700687 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700716 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.700828 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 
11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.701703 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.702840 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.707321 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.708191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.708942 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.709615 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.715984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.719101 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"swift-proxy-55bc6945f7-5kkp2\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:28 crc kubenswrapper[4775]: I0127 11:38:28.816516 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058917 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30" exitCode=0 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058949 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a" exitCode=2 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058957 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66" exitCode=0 Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.058976 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.059002 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.059012 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66"} Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.062498 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.067242 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.099375 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.175786 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.177058 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.197750 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.207745 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.209022 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.214251 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.217705 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.217882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.218195 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319602 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319697 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319725 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319765 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319806 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.319848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c27ln\" (UniqueName: 
\"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.320590 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.338064 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.339154 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.352368 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.354122 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.358255 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.361039 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"nova-api-db-create-p9q28\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.368148 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.380639 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.390063 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.420924 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421059 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421201 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.421236 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.422017 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.424295 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.444542 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c27ln\" (UniqueName: 
\"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"nova-cell0-db-create-k4m7t\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.446709 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"nova-api-6423-account-create-update-h7gvh\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.493642 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.517035 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.522656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523004 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523047 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.523703 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.525589 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.537189 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.555232 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"nova-cell1-db-create-tfv9j\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.564574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.566227 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.568943 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.581067 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.627881 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.628157 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.629360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.653237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"nova-cell0-8850-account-create-update-bwmll\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.704991 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.730639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.730769 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.794047 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.833290 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.833417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.843789 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.872042 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"nova-cell1-8d66-account-create-update-qwzzn\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.912473 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:38:29 crc kubenswrapper[4775]: I0127 11:38:29.920530 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:29 crc kubenswrapper[4775]: W0127 11:38:29.935084 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027dfac2_8504_46aa_9302_19df71441688.slice/crio-a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1 WatchSource:0}: Error finding container a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1: Status 404 returned error can't find the container with id a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1 Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.010830 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:38:30 crc kubenswrapper[4775]: W0127 11:38:30.032909 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47764d9e_0435_43b7_aa95_e0a7e0d8b9c1.slice/crio-86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36 WatchSource:0}: Error finding container 86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36: Status 404 returned error can't find the container with id 86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36 Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.070701 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerStarted","Data":"86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.073784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerStarted","Data":"a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.079957 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.079998 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"fa5db8a5c7621b855f9aee7c911007cac93d44ed2023e821a1db694da3d675fa"} Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.178180 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:38:30 crc kubenswrapper[4775]: W0127 11:38:30.202563 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb03d69b1_c651_4b79_9ba1_581dc15737a6.slice/crio-8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc WatchSource:0}: Error finding container 8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc: Status 404 returned error can't find the container with id 8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.334399 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.425074 4775 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:38:30 crc kubenswrapper[4775]: I0127 11:38:30.526764 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.092987 4775 generic.go:334] "Generic (PLEG): container finished" podID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerID="4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f" exitCode=0 Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.093250 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerDied","Data":"4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.093277 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerStarted","Data":"8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.100190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerStarted","Data":"d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102335 4775 generic.go:334] "Generic (PLEG): container finished" podID="d6287027-2778-4115-b173-62b1600d0247" containerID="7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5" exitCode=0 Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102400 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerDied","Data":"7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.102426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerStarted","Data":"d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.104945 4775 generic.go:334] "Generic (PLEG): container finished" podID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerID="f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf" exitCode=0 Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.105022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerDied","Data":"f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.108763 4775 generic.go:334] "Generic (PLEG): container finished" podID="027dfac2-8504-46aa-9302-19df71441688" containerID="a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3" exitCode=0 Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.108841 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerDied","Data":"a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3"} Jan 27 11:38:31 crc 
kubenswrapper[4775]: I0127 11:38:31.134727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerStarted","Data":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.134993 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.135116 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:31 crc kubenswrapper[4775]: I0127 11:38:31.187629 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podStartSLOduration=3.187611936 podStartE2EDuration="3.187611936s" podCreationTimestamp="2026-01-27 11:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:31.176952405 +0000 UTC m=+1090.318550182" watchObservedRunningTime="2026-01-27 11:38:31.187611936 +0000 UTC m=+1090.329209703" Jan 27 11:38:32 crc kubenswrapper[4775]: I0127 11:38:32.573768 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 27 11:38:33 crc kubenswrapper[4775]: I0127 11:38:33.154382 4775 generic.go:334] "Generic (PLEG): container finished" podID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerID="9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361" exitCode=0 Jan 27 11:38:33 crc kubenswrapper[4775]: I0127 11:38:33.154459 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361"} Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.118248 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.119753 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" containerID="cri-o://bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" gracePeriod=30 Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.119849 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" containerID="cri-o://42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" gracePeriod=30 Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175407 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175654 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" containerID="cri-o://dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" gracePeriod=30 Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.175717 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" 
containerID="cri-o://89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" gracePeriod=30 Jan 27 11:38:35 crc kubenswrapper[4775]: I0127 11:38:35.255130 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.181131 4775 generic.go:334] "Generic (PLEG): container finished" podID="29838f60-9966-4962-9842-b6010abc1468" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" exitCode=143 Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.181194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"} Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.183646 4775 generic.go:334] "Generic (PLEG): container finished" podID="7287c167-2d78-4766-b072-0762f4c4d504" containerID="42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" exitCode=0 Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.183682 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7"} Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.757952 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.845676 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.868052 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.882084 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.893377 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.898236 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") pod \"d6287027-2778-4115-b173-62b1600d0247\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.898338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") pod \"d6287027-2778-4115-b173-62b1600d0247\" (UID: \"d6287027-2778-4115-b173-62b1600d0247\") " Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.899401 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6287027-2778-4115-b173-62b1600d0247" (UID: "d6287027-2778-4115-b173-62b1600d0247"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:36 crc kubenswrapper[4775]: I0127 11:38:36.905923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts" (OuterVolumeSpecName: "kube-api-access-fbhts") pod "d6287027-2778-4115-b173-62b1600d0247" (UID: "d6287027-2778-4115-b173-62b1600d0247"). InnerVolumeSpecName "kube-api-access-fbhts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000493 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") pod \"027dfac2-8504-46aa-9302-19df71441688\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") pod \"b03d69b1-c651-4b79-9ba1-581dc15737a6\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000648 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000687 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000730 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.000807 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") pod \"f43a36d6-24df-43c5-9d20-aaa35c11f855\" (UID: \"f43a36d6-24df-43c5-9d20-aaa35c11f855\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") pod \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001073 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") pod \"b03d69b1-c651-4b79-9ba1-581dc15737a6\" (UID: \"b03d69b1-c651-4b79-9ba1-581dc15737a6\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001120 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") pod \"027dfac2-8504-46aa-9302-19df71441688\" (UID: \"027dfac2-8504-46aa-9302-19df71441688\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001145 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "027dfac2-8504-46aa-9302-19df71441688" (UID: "027dfac2-8504-46aa-9302-19df71441688"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001163 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") pod \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\" (UID: \"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1\") " Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.001907 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002224 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/027dfac2-8504-46aa-9302-19df71441688-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002297 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002357 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6287027-2778-4115-b173-62b1600d0247-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002416 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbhts\" (UniqueName: \"kubernetes.io/projected/d6287027-2778-4115-b173-62b1600d0247-kube-api-access-fbhts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.002496 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f43a36d6-24df-43c5-9d20-aaa35c11f855-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.003926 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj" (OuterVolumeSpecName: "kube-api-access-t5szj") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "kube-api-access-t5szj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.004822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9" (OuterVolumeSpecName: "kube-api-access-6rbr9") pod "027dfac2-8504-46aa-9302-19df71441688" (UID: "027dfac2-8504-46aa-9302-19df71441688"). InnerVolumeSpecName "kube-api-access-6rbr9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.005229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b03d69b1-c651-4b79-9ba1-581dc15737a6" (UID: "b03d69b1-c651-4b79-9ba1-581dc15737a6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.005587 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" (UID: "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.006839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts" (OuterVolumeSpecName: "scripts") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.007182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g" (OuterVolumeSpecName: "kube-api-access-kcs7g") pod "b03d69b1-c651-4b79-9ba1-581dc15737a6" (UID: "b03d69b1-c651-4b79-9ba1-581dc15737a6"). InnerVolumeSpecName "kube-api-access-kcs7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.007219 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln" (OuterVolumeSpecName: "kube-api-access-c27ln") pod "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" (UID: "47764d9e-0435-43b7-aa95-e0a7e0d8b9c1"). InnerVolumeSpecName "kube-api-access-c27ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.037245 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.076565 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.097915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data" (OuterVolumeSpecName: "config-data") pod "f43a36d6-24df-43c5-9d20-aaa35c11f855" (UID: "f43a36d6-24df-43c5-9d20-aaa35c11f855"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103732 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103765 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103778 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103790 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c27ln\" (UniqueName: \"kubernetes.io/projected/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-kube-api-access-c27ln\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103804 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b03d69b1-c651-4b79-9ba1-581dc15737a6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103815 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rbr9\" (UniqueName: \"kubernetes.io/projected/027dfac2-8504-46aa-9302-19df71441688-kube-api-access-6rbr9\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103826 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103836 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f43a36d6-24df-43c5-9d20-aaa35c11f855-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103846 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5szj\" (UniqueName: \"kubernetes.io/projected/f43a36d6-24df-43c5-9d20-aaa35c11f855-kube-api-access-t5szj\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.103859 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcs7g\" (UniqueName: \"kubernetes.io/projected/b03d69b1-c651-4b79-9ba1-581dc15737a6-kube-api-access-kcs7g\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.194922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-k4m7t" event={"ID":"47764d9e-0435-43b7-aa95-e0a7e0d8b9c1","Type":"ContainerDied","Data":"86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.194968 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86b7c941d9f31eeece2b46db2ab6d463c1a4388701a1ca57f3c4d733d0767a36" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.195031 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-k4m7t" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-p9q28" event={"ID":"027dfac2-8504-46aa-9302-19df71441688","Type":"ContainerDied","Data":"a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200250 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5ea522bd09fbcecdc65ecabc0fa4a29a01352b93d9ae530d069999a7f1373c1" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.200223 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-p9q28" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202001 4775 generic.go:334] "Generic (PLEG): container finished" podID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerID="23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95" exitCode=0 Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerDied","Data":"23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.202125 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerStarted","Data":"3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204015 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-6423-account-create-update-h7gvh" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-6423-account-create-update-h7gvh" event={"ID":"b03d69b1-c651-4b79-9ba1-581dc15737a6","Type":"ContainerDied","Data":"8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.204068 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8264f5ffac87ac50ec89fdbbebc334da39912586980294ccc0b8af77827b76cc" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.209916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f43a36d6-24df-43c5-9d20-aaa35c11f855","Type":"ContainerDied","Data":"28b64b4fcdfe6c67d081958bb4e6c186a5ea1015e7bbc85f180d20d2234b064c"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.209976 4775 scope.go:117] "RemoveContainer" containerID="757f8cda3b6f903a401192990356764bd59a5026006946e21249f4fd71282e30" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.210099 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.221489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"db40a4a8-ce91-40a6-8b63-ccc17ed327da","Type":"ContainerStarted","Data":"2fdf573c5dbc3537a484b71651345a941aea5fdf62703cfc9098c6d6cbdcf0dc"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.226944 4775 generic.go:334] "Generic (PLEG): container finished" podID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerID="3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b" exitCode=0 Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.227022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerDied","Data":"3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233022 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tfv9j" event={"ID":"d6287027-2778-4115-b173-62b1600d0247","Type":"ContainerDied","Data":"d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85"} Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233069 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d47e760d11933ebe6cd27f4052b822d367fe82a52bdb02d413a13c9bc07bfd85" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.233117 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tfv9j" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.262577 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.691530685 podStartE2EDuration="13.261252177s" podCreationTimestamp="2026-01-27 11:38:24 +0000 UTC" firstStartedPulling="2026-01-27 11:38:24.962343238 +0000 UTC m=+1084.103941015" lastFinishedPulling="2026-01-27 11:38:36.53206473 +0000 UTC m=+1095.673662507" observedRunningTime="2026-01-27 11:38:37.259230462 +0000 UTC m=+1096.400828249" watchObservedRunningTime="2026-01-27 11:38:37.261252177 +0000 UTC m=+1096.402849954" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.293311 4775 scope.go:117] "RemoveContainer" containerID="004b1d31e12b92a12b6611a9cd3172251cdec0a27132ad6e5347a1433fe5b67a" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.343869 4775 scope.go:117] "RemoveContainer" containerID="9df7ce8e17e4380ee4b7c55578b2dda866d82c6471224b6ea2cb8602d082c361" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.369495 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.375415 4775 scope.go:117] "RemoveContainer" containerID="c9b6a0c545f10363ab83ee451af24f75b0c3422868d2657358c693fd0f9f4e66" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.397553 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412042 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412439 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412473 4775 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412484 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412491 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412501 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412506 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412571 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412579 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412630 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412636 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412647 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412653 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412666 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412672 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: E0127 11:38:37.412680 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412686 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412915 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6287027-2778-4115-b173-62b1600d0247" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412932 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-notification-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412944 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412956 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" containerName="mariadb-account-create-update" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412970 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="027dfac2-8504-46aa-9302-19df71441688" containerName="mariadb-database-create" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="proxy-httpd" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412989 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="ceilometer-central-agent" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.412997 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" containerName="sg-core" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.415662 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.417968 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.418743 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.434506 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.511977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512014 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512048 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512135 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" 
(UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512163 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.512187 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613353 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613597 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613621 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.613661 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.615024 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod 
\"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.615894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.618233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.619208 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.627860 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.630481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.633260 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"ceilometer-0\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " pod="openstack/ceilometer-0" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.754954 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43a36d6-24df-43c5-9d20-aaa35c11f855" path="/var/lib/kubelet/pods/f43a36d6-24df-43c5-9d20-aaa35c11f855/volumes" Jan 27 11:38:37 crc kubenswrapper[4775]: I0127 11:38:37.777008 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.125792 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.295234 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:38 crc kubenswrapper[4775]: W0127 11:38:38.356870 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod093516f4_3b85_4290_98c0_006f41e91129.slice/crio-6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d WatchSource:0}: Error finding container 6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d: Status 404 returned error can't find the container with id 6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.737753 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.746714 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.752271 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.757664 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.827881 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.828520 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66f4cff584-s28fg" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" containerID="cri-o://4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" gracePeriod=30 Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.828678 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66f4cff584-s28fg" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" containerID="cri-o://a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" gracePeriod=30 Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832045 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") pod \"a18608c5-afda-4481-9c6d-a576dfd4d803\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: 
\"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832306 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832362 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832384 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832464 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832495 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") pod \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832522 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") pod \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\" (UID: \"aeed29be-d561-4bf4-bdc1-c180e1983a3c\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832557 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") pod \"a18608c5-afda-4481-9c6d-a576dfd4d803\" (UID: \"a18608c5-afda-4481-9c6d-a576dfd4d803\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832595 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: \"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.832646 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") pod \"29838f60-9966-4962-9842-b6010abc1468\" (UID: 
\"29838f60-9966-4962-9842-b6010abc1468\") " Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.834653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.835357 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aeed29be-d561-4bf4-bdc1-c180e1983a3c" (UID: "aeed29be-d561-4bf4-bdc1-c180e1983a3c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.839923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.843551 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.844556 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs" (OuterVolumeSpecName: "logs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.844798 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a18608c5-afda-4481-9c6d-a576dfd4d803" (UID: "a18608c5-afda-4481-9c6d-a576dfd4d803"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.865488 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.867822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s" (OuterVolumeSpecName: "kube-api-access-rg99s") pod "a18608c5-afda-4481-9c6d-a576dfd4d803" (UID: "a18608c5-afda-4481-9c6d-a576dfd4d803"). InnerVolumeSpecName "kube-api-access-rg99s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.870603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts" (OuterVolumeSpecName: "scripts") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.874071 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s" (OuterVolumeSpecName: "kube-api-access-9wz5s") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "kube-api-access-9wz5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.875640 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c" (OuterVolumeSpecName: "kube-api-access-7sb9c") pod "aeed29be-d561-4bf4-bdc1-c180e1983a3c" (UID: "aeed29be-d561-4bf4-bdc1-c180e1983a3c"). InnerVolumeSpecName "kube-api-access-7sb9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947958 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sb9c\" (UniqueName: \"kubernetes.io/projected/aeed29be-d561-4bf4-bdc1-c180e1983a3c-kube-api-access-7sb9c\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947987 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aeed29be-d561-4bf4-bdc1-c180e1983a3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.947997 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg99s\" (UniqueName: \"kubernetes.io/projected/a18608c5-afda-4481-9c6d-a576dfd4d803-kube-api-access-rg99s\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948011 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948021 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29838f60-9966-4962-9842-b6010abc1468-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948030 4775 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a18608c5-afda-4481-9c6d-a576dfd4d803-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948043 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9wz5s\" (UniqueName: \"kubernetes.io/projected/29838f60-9966-4962-9842-b6010abc1468-kube-api-access-9wz5s\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948188 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.948198 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/29838f60-9966-4962-9842-b6010abc1468-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:38 crc kubenswrapper[4775]: I0127 11:38:38.979896 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.009154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.010935 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data" (OuterVolumeSpecName: "config-data") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.045756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "29838f60-9966-4962-9842-b6010abc1468" (UID: "29838f60-9966-4962-9842-b6010abc1468"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051032 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051058 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051068 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.051076 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29838f60-9966-4962-9842-b6010abc1468-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250404 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-8850-account-create-update-bwmll" event={"ID":"aeed29be-d561-4bf4-bdc1-c180e1983a3c","Type":"ContainerDied","Data":"d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250683 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d58491ed7d0755e66bb744d214205950b95159736589ca24555561234307e146" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.250428 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-8850-account-create-update-bwmll" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.255758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.255790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.258084 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerID="a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.258136 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.260727 4775 generic.go:334] "Generic (PLEG): container finished" podID="7287c167-2d78-4766-b072-0762f4c4d504" containerID="bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.260773 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262747 4775 generic.go:334] "Generic (PLEG): container finished" podID="29838f60-9966-4962-9842-b6010abc1468" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" exitCode=0 Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262817 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"29838f60-9966-4962-9842-b6010abc1468","Type":"ContainerDied","Data":"33dee4dc93223d68ed0c9843e6651623dd7c73f98dd4eee5700b9bc73cb6734c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262852 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.262863 4775 scope.go:117] "RemoveContainer" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.279998 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.282318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8d66-account-create-update-qwzzn" event={"ID":"a18608c5-afda-4481-9c6d-a576dfd4d803","Type":"ContainerDied","Data":"3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c"} Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.282374 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3add38581e28b29cf3951dafb72991b84c4a1e1fef3b9052c7bf2dbc049b4e0c" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.296580 4775 scope.go:117] "RemoveContainer" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.311541 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.324940 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.340389 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341281 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341344 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341401 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341487 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341587 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.341678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.341747 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342014 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342285 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.342355 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="29838f60-9966-4962-9842-b6010abc1468" containerName="cinder-api-log" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 
11:38:39.342437 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" containerName="mariadb-account-create-update" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.343620 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.347020 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.351179 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.352509 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.354149 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.354550 4775 scope.go:117] "RemoveContainer" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.355035 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": container with ID starting with 89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290 not found: ID does not exist" containerID="89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.355062 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290"} err="failed to get container status \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": rpc error: code = NotFound desc = could not find container \"89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290\": container with ID starting with 89398b17c90c3c2cbf2cc335ddc60ea9977eeb1527ef76a633158952880e6290 not found: ID does not exist" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.355082 4775 scope.go:117] "RemoveContainer" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: E0127 11:38:39.356402 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": container with ID starting with dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374 not found: ID does not exist" containerID="dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.356418 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374"} err="failed to get container status \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": rpc error: code = NotFound desc = could not find container \"dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374\": container with ID starting with dd4908a9bb1e010ebe91169b855950e11413c7957c9fa828f7e348c1b5761374 not found: ID does not exist" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457830 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457879 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457957 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.457988 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458050 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458091 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458140 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trvc\" (UniqueName: \"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.458221 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560400 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560548 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560577 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trvc\" (UniqueName: \"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560654 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560695 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.560712 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.573360 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d670312-cbe8-44de-8f6f-857772d2af05-logs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.574108 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/3d670312-cbe8-44de-8f6f-857772d2af05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.578421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.583091 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.583911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.584288 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-public-tls-certs\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.586960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-config-data-custom\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.596535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3d670312-cbe8-44de-8f6f-857772d2af05-scripts\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.598394 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8trvc\" (UniqueName: \"kubernetes.io/projected/3d670312-cbe8-44de-8f6f-857772d2af05-kube-api-access-8trvc\") pod \"cinder-api-0\" (UID: \"3d670312-cbe8-44de-8f6f-857772d2af05\") " pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.672775 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.758398 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29838f60-9966-4962-9842-b6010abc1468" path="/var/lib/kubelet/pods/29838f60-9966-4962-9842-b6010abc1468/volumes" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.820193 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.969376 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.969900 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970095 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970299 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970378 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.970421 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") pod \"7287c167-2d78-4766-b072-0762f4c4d504\" (UID: \"7287c167-2d78-4766-b072-0762f4c4d504\") " Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.971403 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982344 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g" (OuterVolumeSpecName: "kube-api-access-ctt6g") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "kube-api-access-ctt6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982346 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts" (OuterVolumeSpecName: "scripts") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:39 crc kubenswrapper[4775]: I0127 11:38:39.982706 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.052199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072210 4775 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7287c167-2d78-4766-b072-0762f4c4d504-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072244 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072253 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072263 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.072272 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctt6g\" (UniqueName: \"kubernetes.io/projected/7287c167-2d78-4766-b072-0762f4c4d504-kube-api-access-ctt6g\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.104960 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data" (OuterVolumeSpecName: "config-data") pod "7287c167-2d78-4766-b072-0762f4c4d504" (UID: "7287c167-2d78-4766-b072-0762f4c4d504"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.163211 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.173946 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7287c167-2d78-4766-b072-0762f4c4d504-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301434 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7287c167-2d78-4766-b072-0762f4c4d504","Type":"ContainerDied","Data":"bd8ae068132a0cde3e52d3eb2417624a42f137d7a8867511b824433d3a994398"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301497 4775 scope.go:117] "RemoveContainer" containerID="42634da366d0324b3faac04253eb83641574ae12f3e9cc409177c836453b0cb7" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.301458 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.314850 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"6eb38ee4612f38acec5d852feb5e8b04f871568e33480d91f8e13c85951dfadd"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.323240 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.334595 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.345298 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.360105 4775 scope.go:117] "RemoveContainer" containerID="bbb066bf267b9b4c21870b464097c872ce5e07c929ddc57dfd10b2d4417b3e8c" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.370689 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: E0127 11:38:40.371107 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371118 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: E0127 11:38:40.371141 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371147 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371311 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="cinder-scheduler" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.371323 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7287c167-2d78-4766-b072-0762f4c4d504" containerName="probe" Jan 27 11:38:40 crc 
kubenswrapper[4775]: I0127 11:38:40.372223 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.380614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.381076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478354 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.478964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.479045 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584184 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584249 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " 
pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.584639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585396 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.585614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/030ef7f1-5f79-42e9-800e-55c4f70964e5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.590200 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.591379 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-scripts\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.598799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.600609 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/030ef7f1-5f79-42e9-800e-55c4f70964e5-config-data\") pod \"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.608094 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnc5k\" (UniqueName: \"kubernetes.io/projected/030ef7f1-5f79-42e9-800e-55c4f70964e5-kube-api-access-dnc5k\") pod 
\"cinder-scheduler-0\" (UID: \"030ef7f1-5f79-42e9-800e-55c4f70964e5\") " pod="openstack/cinder-scheduler-0" Jan 27 11:38:40 crc kubenswrapper[4775]: I0127 11:38:40.749506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.306714 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 27 11:38:41 crc kubenswrapper[4775]: W0127 11:38:41.318492 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030ef7f1_5f79_42e9_800e_55c4f70964e5.slice/crio-9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d WatchSource:0}: Error finding container 9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d: Status 404 returned error can't find the container with id 9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.363988 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.367144 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"9434333b7abc74560277574b36915b6dbdaf400b87b052334936ee09c6619e3d"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.369881 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"58702b3569f40cd06e51a8afa2d27fff5a3e1b8bcd993716870b1819390a0075"} Jan 27 11:38:41 crc kubenswrapper[4775]: I0127 11:38:41.761746 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7287c167-2d78-4766-b072-0762f4c4d504" path="/var/lib/kubelet/pods/7287c167-2d78-4766-b072-0762f4c4d504/volumes" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.397873 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"f1971861217a880e224af0a1c5a5cefad7b0c93edf4ecdc4d5d6b7ea42934acd"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.411597 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"3d670312-cbe8-44de-8f6f-857772d2af05","Type":"ContainerStarted","Data":"538331ab0712b0add08c5b63361b93aed077ba69f583f33736f2ce481499e323"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.411718 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.431883 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.431867083 podStartE2EDuration="3.431867083s" podCreationTimestamp="2026-01-27 11:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:42.425606902 +0000 UTC m=+1101.567204689" watchObservedRunningTime="2026-01-27 11:38:42.431867083 +0000 UTC m=+1101.573464860" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437684 4775 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerStarted","Data":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437853 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" containerID="cri-o://9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.437960 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438263 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" containerID="cri-o://7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438304 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" containerID="cri-o://43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.438340 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" containerID="cri-o://24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" gracePeriod=30 Jan 27 11:38:42 crc kubenswrapper[4775]: I0127 11:38:42.457077 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.949702099 podStartE2EDuration="5.457063744s" podCreationTimestamp="2026-01-27 11:38:37 +0000 UTC" firstStartedPulling="2026-01-27 11:38:38.369722785 +0000 UTC m=+1097.511320562" lastFinishedPulling="2026-01-27 11:38:41.87708444 +0000 UTC m=+1101.018682207" observedRunningTime="2026-01-27 11:38:42.456227061 +0000 UTC m=+1101.597824838" watchObservedRunningTime="2026-01-27 11:38:42.457063744 +0000 UTC m=+1101.598661521" Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.449609 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"030ef7f1-5f79-42e9-800e-55c4f70964e5","Type":"ContainerStarted","Data":"f8c39efe4d2dcc34b627df41a48948ef3ee1ac1a734fb7f5acd3072575dd0fc5"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458063 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" exitCode=0 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458107 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" exitCode=2 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.458121 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" exitCode=0 Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459186 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459227 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.459242 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} Jan 27 11:38:43 crc kubenswrapper[4775]: I0127 11:38:43.477242 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.477224951 podStartE2EDuration="3.477224951s" podCreationTimestamp="2026-01-27 11:38:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:43.475817402 +0000 UTC m=+1102.617415189" watchObservedRunningTime="2026-01-27 11:38:43.477224951 +0000 UTC m=+1102.618822728" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.472224 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerID="4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" exitCode=0 Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.472287 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d"} Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.853287 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.854616 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.856250 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.856578 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.864042 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.866706 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.871193 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976319 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976365 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976608 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") pod \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\" (UID: \"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed\") " Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976855 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976899 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 11:38:44.976980 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:44 crc kubenswrapper[4775]: I0127 
11:38:44.996003 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k" (OuterVolumeSpecName: "kube-api-access-xxk8k") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "kube-api-access-xxk8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.010004 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.043915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config" (OuterVolumeSpecName: "config") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.046674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.076402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" (UID: "f4a91fe0-4cd5-496b-9acd-7b874a2c3bed"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078311 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078418 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078521 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078584 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078603 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078614 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078626 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.078635 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxk8k\" (UniqueName: \"kubernetes.io/projected/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed-kube-api-access-xxk8k\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.081594 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.081941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.082964 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.094784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"nova-cell0-conductor-db-sync-6bh7g\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.181521 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.253558 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.253732 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.481687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66f4cff584-s28fg" event={"ID":"f4a91fe0-4cd5-496b-9acd-7b874a2c3bed","Type":"ContainerDied","Data":"98a20e3bbe057f1a1083416d0cff14282fdc9e2fca7261f4540fdf9a82145994"} Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.481766 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66f4cff584-s28fg" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.482031 4775 scope.go:117] "RemoveContainer" containerID="a7a6a0a041650648d435f425352e57c5d669972574c1edc44a04c82383216931" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.531110 4775 scope.go:117] "RemoveContainer" containerID="4940cda0a55ac3bfa8b35deb3e51723cf26072d3cd145374c8d469bfb275193d" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.533830 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.545160 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66f4cff584-s28fg"] Jan 27 11:38:45 crc kubenswrapper[4775]: W0127 11:38:45.629310 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b5e7b0a_a4d0_4c64_b273_2b47230efd17.slice/crio-d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0 WatchSource:0}: Error finding container d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0: Status 404 returned error can't find the container with id d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0 Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.631781 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.757529 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" path="/var/lib/kubelet/pods/f4a91fe0-4cd5-496b-9acd-7b874a2c3bed/volumes" Jan 27 11:38:45 crc kubenswrapper[4775]: I0127 11:38:45.758763 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 27 11:38:46 crc kubenswrapper[4775]: I0127 11:38:46.501263 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerStarted","Data":"d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.074793 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.075167 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075182 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.075198 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075205 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075368 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-api" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.075385 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a91fe0-4cd5-496b-9acd-7b874a2c3bed" containerName="neutron-httpd" Jan 27 
11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.076255 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.096231 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.101808 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231744 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231814 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.231985 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232003 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232032 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232070 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232304 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232390 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232410 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232466 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232526 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.232574 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.233425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.234349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.238600 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts" (OuterVolumeSpecName: "scripts") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.238783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42" (OuterVolumeSpecName: "kube-api-access-s9z42") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "kube-api-access-s9z42". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.262851 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.332965 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334042 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") pod \"093516f4-3b85-4290-98c0-006f41e91129\" (UID: \"093516f4-3b85-4290-98c0-006f41e91129\") " Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334277 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334412 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334465 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334518 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334561 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334637 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334649 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334658 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9z42\" (UniqueName: \"kubernetes.io/projected/093516f4-3b85-4290-98c0-006f41e91129-kube-api-access-s9z42\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334668 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.334676 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/093516f4-3b85-4290-98c0-006f41e91129-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.335829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-log-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: W0127 11:38:47.337135 4775 empty_dir.go:500] 
Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/093516f4-3b85-4290-98c0-006f41e91129/volumes/kubernetes.io~secret/combined-ca-bundle Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.337181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.338863 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-run-httpd\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.339853 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-internal-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.339892 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-combined-ca-bundle\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.341441 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-config-data\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.343466 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-etc-swift\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.348985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-public-tls-certs\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.354480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89qz\" (UniqueName: \"kubernetes.io/projected/e22ddb6f-e33b-41ea-a24f-c97c0676e6e5-kube-api-access-x89qz\") pod \"swift-proxy-66648b46df-hskmp\" (UID: \"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5\") " pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.371612 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data" (OuterVolumeSpecName: "config-data") pod 
"093516f4-3b85-4290-98c0-006f41e91129" (UID: "093516f4-3b85-4290-98c0-006f41e91129"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.406682 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.438873 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.438914 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/093516f4-3b85-4290-98c0-006f41e91129-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520189 4775 generic.go:334] "Generic (PLEG): container finished" podID="093516f4-3b85-4290-98c0-006f41e91129" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" exitCode=0 Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520228 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520253 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"093516f4-3b85-4290-98c0-006f41e91129","Type":"ContainerDied","Data":"6977770effb03f1e311c752b3dcdf9fe577bbfd90a405744eb50e6760440fc9d"} Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520269 4775 scope.go:117] "RemoveContainer" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.520269 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.552935 4775 scope.go:117] "RemoveContainer" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.562363 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.577819 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586015 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586392 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586407 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586435 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586441 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586470 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586477 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.586487 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586493 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586643 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="proxy-httpd" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586655 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-notification-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586672 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="ceilometer-central-agent" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.586681 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="093516f4-3b85-4290-98c0-006f41e91129" containerName="sg-core" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.588273 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.592300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.592643 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.593881 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.604550 4775 scope.go:117] "RemoveContainer" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.638105 4775 scope.go:117] "RemoveContainer" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683088 4775 scope.go:117] "RemoveContainer" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.683707 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": container with ID starting with 7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32 not found: ID does not exist" containerID="7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683763 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32"} err="failed to get container status \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": rpc error: code = NotFound desc = could not find container \"7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32\": container with ID starting with 7936bfbdd39ac4236f42dc3b1a5f1afeebb97f2073e488d3a0319cf2a4373b32 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.683852 4775 scope.go:117] "RemoveContainer" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.684598 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": container with ID starting with 43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88 not found: ID does not exist" containerID="43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684628 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88"} err="failed to get container status \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": rpc error: code = NotFound desc = could not find container \"43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88\": container with ID starting with 43e0082b100800f58b2a0431ba2b4fe26d29d21e72c3ef68383c656e41044a88 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684646 4775 scope.go:117] "RemoveContainer" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 
11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.684891 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": container with ID starting with 24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c not found: ID does not exist" containerID="24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.684975 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c"} err="failed to get container status \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": rpc error: code = NotFound desc = could not find container \"24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c\": container with ID starting with 24333e83f713910928df657131d8a2ff4da324b21983160084832f69b8c6c78c not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.685001 4775 scope.go:117] "RemoveContainer" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: E0127 11:38:47.685586 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": container with ID starting with 9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46 not found: ID does not exist" containerID="9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.685630 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46"} err="failed to get container status \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": rpc error: code = NotFound desc = could not find container \"9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46\": container with ID starting with 9f5d91b55607485d77ea8595e4781bd1416ea42511aebc1fbb6f773f0cdfbb46 not found: ID does not exist" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748602 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748695 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748741 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748893 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.748928 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.767291 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="093516f4-3b85-4290-98c0-006f41e91129" path="/var/lib/kubelet/pods/093516f4-3b85-4290-98c0-006f41e91129/volumes" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.849945 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850007 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850052 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850518 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850867 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.850934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvx4h\" (UniqueName: 
\"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851018 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851085 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.851535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.857957 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.858427 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.859834 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.860718 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.873063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ceilometer-0\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.915010 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:38:47 crc kubenswrapper[4775]: I0127 11:38:47.948520 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-66648b46df-hskmp"] Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.378243 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"cc40f7435155744fb8655a00a8bcfac37f639ca67526c1a7fb8e190dfcda6662"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533565 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"f44610c4492d47323f8fc309fa651e0d922d1202d7775de5462e1c61c3e2c2b0"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533576 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-66648b46df-hskmp" event={"ID":"e22ddb6f-e33b-41ea-a24f-c97c0676e6e5","Type":"ContainerStarted","Data":"dabb6eafd56ad25cc1d4be4c65aba691db7d53364e22e215df7a739017b279d4"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.533721 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.534880 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"8683e19a7bcd30570af286ce01224a28b785c454609defaa562ddd8aa8e80071"} Jan 27 11:38:48 crc kubenswrapper[4775]: I0127 11:38:48.558520 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-66648b46df-hskmp" podStartSLOduration=1.558501506 podStartE2EDuration="1.558501506s" podCreationTimestamp="2026-01-27 11:38:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:48.55213073 +0000 UTC m=+1107.693728517" watchObservedRunningTime="2026-01-27 11:38:48.558501506 +0000 UTC m=+1107.700099283" Jan 27 11:38:49 crc kubenswrapper[4775]: I0127 11:38:49.547408 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} Jan 27 11:38:49 crc kubenswrapper[4775]: I0127 11:38:49.547773 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.558282 4775 generic.go:334] "Generic (PLEG): container finished" podID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerID="b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" exitCode=137 Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.559331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff"} Jan 27 11:38:50 crc kubenswrapper[4775]: I0127 11:38:50.957612 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/cinder-scheduler-0" Jan 27 11:38:51 crc kubenswrapper[4775]: I0127 11:38:51.635862 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.149359 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.151440 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.188070 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.189659 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.197676 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.207337 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.269965 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270018 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270066 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270159 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270254 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270296 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270322 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.270340 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.296567 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.304278 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.354677 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371483 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371548 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371609 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371634 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371681 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371719 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371740 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371760 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371788 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371856 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371883 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.371909 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: 
\"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.372382 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.375461 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.377766 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.378676 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.379221 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.383230 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.391628 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.401399 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"barbican-worker-6d876c7c6f-jvj5b\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.405554 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod 
\"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.412045 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"barbican-keystone-listener-6695647446-72d6k\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.470298 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474520 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474638 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474752 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474842 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.474934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475181 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.475784 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.479847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.479911 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.481401 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.484051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.488359 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.501026 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"barbican-api-5897cf85c8-ppd2f\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.516015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.625289 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.667362 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.667728 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" containerID="cri-o://164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" gracePeriod=30 Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.668236 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" containerID="cri-o://1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" gracePeriod=30 Jan 27 11:38:54 crc kubenswrapper[4775]: I0127 11:38:54.748719 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.253629 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-84666cddfd-6l8vq" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.528609 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.544191 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.578547 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.580417 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.594520 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.596315 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.609920 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.622198 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.654399 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.664296 4775 generic.go:334] "Generic (PLEG): container finished" podID="b138b14c-964d-465d-a534-c7aff1633e76" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" exitCode=143 Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.664339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"} Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.678228 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.679817 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.688734 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698051 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698218 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698306 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698388 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698487 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698826 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.698905 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699068 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.699096 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800565 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800586 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: 
\"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800611 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800628 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800683 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800735 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800784 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800805 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800888 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800914 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800936 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.800967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.801437 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-logs\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.802362 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1138f75c-8e56-4a32-8110-8b26d9f80688-logs\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.806608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-combined-ca-bundle\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.807063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data-custom\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.808854 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-combined-ca-bundle\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.809552 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.818097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-config-data-custom\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.819410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn24v\" (UniqueName: \"kubernetes.io/projected/1138f75c-8e56-4a32-8110-8b26d9f80688-kube-api-access-zn24v\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.819899 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1138f75c-8e56-4a32-8110-8b26d9f80688-config-data\") pod \"barbican-keystone-listener-78f66698d-fbfmx\" (UID: \"1138f75c-8e56-4a32-8110-8b26d9f80688\") " pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.821811 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pckjb\" (UniqueName: \"kubernetes.io/projected/8874fbc9-9d42-45dd-b38b-9ba1a33340f5-kube-api-access-pckjb\") pod \"barbican-worker-5bd6cd4f4f-kxhrc\" (UID: \"8874fbc9-9d42-45dd-b38b-9ba1a33340f5\") " pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.901956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902049 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902083 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902154 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902219 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902255 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.902268 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.903713 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fa6c814-723c-4638-8ae9-dbb9f6864120-logs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.905903 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data-custom\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.906679 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-internal-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.907193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-combined-ca-bundle\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.907569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-config-data\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.909657 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fa6c814-723c-4638-8ae9-dbb9f6864120-public-tls-certs\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.918074 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.922059 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t772l\" (UniqueName: \"kubernetes.io/projected/8fa6c814-723c-4638-8ae9-dbb9f6864120-kube-api-access-t772l\") pod \"barbican-api-5d66b74d76-ngwn9\" (UID: \"8fa6c814-723c-4638-8ae9-dbb9f6864120\") " pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:55 crc kubenswrapper[4775]: I0127 11:38:55.933196 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.002700 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.117994 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.118288 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log" containerID="cri-o://8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" gracePeriod=30 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.118349 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd" containerID="cri-o://2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" gracePeriod=30 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.124809 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.206873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.206990 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207024 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207093 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") pod \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\" (UID: \"98c20582-df9c-4ed1-8c42-0d5d1783e6f4\") " Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207650 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs" (OuterVolumeSpecName: "logs") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.207904 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.213961 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr" (OuterVolumeSpecName: "kube-api-access-xs2dr") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). 
InnerVolumeSpecName "kube-api-access-xs2dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.214279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.235694 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data" (OuterVolumeSpecName: "config-data") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.236052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.250806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts" (OuterVolumeSpecName: "scripts") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.275966 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "98c20582-df9c-4ed1-8c42-0d5d1783e6f4" (UID: "98c20582-df9c-4ed1-8c42-0d5d1783e6f4"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310256 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310622 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310637 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310650 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs2dr\" (UniqueName: \"kubernetes.io/projected/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-kube-api-access-xs2dr\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310664 4775 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.310675 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98c20582-df9c-4ed1-8c42-0d5d1783e6f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.691645 4775 generic.go:334] "Generic (PLEG): container finished" podID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerID="8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" exitCode=143 Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.692316 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.699205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.700641 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerStarted","Data":"cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-84666cddfd-6l8vq" event={"ID":"98c20582-df9c-4ed1-8c42-0d5d1783e6f4","Type":"ContainerDied","Data":"67af1fcb0bcad60b4d6220dc2a58636c77413c902e1d5d58f9a296545b8c138a"} Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703297 4775 scope.go:117] "RemoveContainer" containerID="0eb18ea0a7e8522aa14ee450ec18f20609f48386c58320c99cc54df7dfbb3f2d" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703502 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-84666cddfd-6l8vq" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.703881 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5bd6cd4f4f-kxhrc"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.732705 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" podStartSLOduration=2.049613738 podStartE2EDuration="12.732691172s" podCreationTimestamp="2026-01-27 11:38:44 +0000 UTC" firstStartedPulling="2026-01-27 11:38:45.63904984 +0000 UTC m=+1104.780647617" lastFinishedPulling="2026-01-27 11:38:56.322127274 +0000 UTC m=+1115.463725051" observedRunningTime="2026-01-27 11:38:56.71203122 +0000 UTC m=+1115.853628997" watchObservedRunningTime="2026-01-27 11:38:56.732691172 +0000 UTC m=+1115.874288949" Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.779587 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.788048 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-84666cddfd-6l8vq"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.922087 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.931882 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:38:56 crc kubenswrapper[4775]: I0127 11:38:56.978197 4775 scope.go:117] "RemoveContainer" containerID="b63cf0e89854369b83ebb263e9838c2cb8b2524c2ff119bacd1526747a2980ff" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.107321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d66b74d76-ngwn9"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.313249 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78f66698d-fbfmx"] Jan 27 11:38:57 crc kubenswrapper[4775]: W0127 11:38:57.313317 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9862a859_ad75_4071_ad9a_ec926175e46d.slice/crio-c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a WatchSource:0}: Error finding container c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a: Status 404 returned error can't find the container with id c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.344551 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.418184 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.419691 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-66648b46df-hskmp" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.506430 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.512027 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" 
containerID="cri-o://3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.515902 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-55bc6945f7-5kkp2" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" containerID="cri-o://b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.741551 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.741666 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"94dd9f79f758d901a5bff17b96dea4bc02bd0921b66a706ae353879746b66d0f"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.743040 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.743070 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"332dd1d5955a659196a69a4a345219a2406c5c86fb913a32323480dc0fd29f46"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.776444 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" path="/var/lib/kubelet/pods/98c20582-df9c-4ed1-8c42-0d5d1783e6f4/volumes" Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.777511 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"9b90bab51b264bc1493dfe140fad9990815019bebc9bcefad683cec6ac00649d"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.777660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"65d8296d1a0a5c8204814b2dd9d5aae0c11dfced24f191ff27f67fd58d52aaea"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.777791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" event={"ID":"8874fbc9-9d42-45dd-b38b-9ba1a33340f5","Type":"ContainerStarted","Data":"335fcf56c5d7788d038c75a619f299a0f6b0c61a4fa38fe9339e8278584adfc0"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.783584 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5bd6cd4f4f-kxhrc" podStartSLOduration=2.783565959 podStartE2EDuration="2.783565959s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:57.780254228 +0000 UTC m=+1116.921852005" watchObservedRunningTime="2026-01-27 11:38:57.783565959 +0000 UTC m=+1116.925163736" Jan 27 11:38:57 crc kubenswrapper[4775]: 
I0127 11:38:57.785989 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"d6039f83caf53b93ea687a21a178213a03ac82cd1ad840bff24a9e7dff45e91b"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.786036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"3080ba822c9c02a27ac7c0df05b5563a7b1ff6396ab1cf4bf9aed34ec048e87b"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.787489 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"cbeedb745ed20cd5c9e088a7e51f6dc279f1d562a3587ad830f0e89adee9d852"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.795218 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.798784 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a"} Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835084 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835345 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-695f7dfd45-zbb58" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log" containerID="cri-o://156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" gracePeriod=30 Jan 27 11:38:57 crc kubenswrapper[4775]: I0127 11:38:57.835767 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-695f7dfd45-zbb58" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker" containerID="cri-o://42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.353248 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:58 crc kubenswrapper[4775]: E0127 11:38:58.443720 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6b32f3_f53f_43ba_a349_2f00d5e657d0.slice/crio-conmon-b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe6b32f3_f53f_43ba_a349_2f00d5e657d0.slice/crio-b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495068 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495411 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495519 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495573 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495615 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495630 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.495660 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"b138b14c-964d-465d-a534-c7aff1633e76\" (UID: \"b138b14c-964d-465d-a534-c7aff1633e76\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 
11:38:58.497110 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs" (OuterVolumeSpecName: "logs") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.497379 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.503547 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.511753 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr" (OuterVolumeSpecName: "kube-api-access-w2wkr") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "kube-api-access-w2wkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.540397 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts" (OuterVolumeSpecName: "scripts") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597404 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597430 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597439 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b138b14c-964d-465d-a534-c7aff1633e76-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597482 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.597496 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2wkr\" (UniqueName: \"kubernetes.io/projected/b138b14c-964d-465d-a534-c7aff1633e76-kube-api-access-w2wkr\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.622582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.634619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data" (OuterVolumeSpecName: "config-data") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.642949 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.643299 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "b138b14c-964d-465d-a534-c7aff1633e76" (UID: "b138b14c-964d-465d-a534-c7aff1633e76"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701167 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701202 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701213 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.701224 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b138b14c-964d-465d-a534-c7aff1633e76-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.819998 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839210 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerStarted","Data":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839387 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897cf85c8-ppd2f" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log" containerID="cri-o://c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839637 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839667 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.839696 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5897cf85c8-ppd2f" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api" containerID="cri-o://a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.875896 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerID="156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" exitCode=143 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.875974 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerDied","Data":"156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.890089 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d66b74d76-ngwn9" 
event={"ID":"8fa6c814-723c-4638-8ae9-dbb9f6864120","Type":"ContainerStarted","Data":"f18c44ffc3fe1dd758152a1e96e9f0872974147028a0adf96b9ca33df41bef76"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.891199 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.891225 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.903881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.903968 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904107 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904158 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904208 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.904287 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") pod \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\" (UID: \"fe6b32f3-f53f-43ba-a349-2f00d5e657d0\") " Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.906402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd" 
(OuterVolumeSpecName: "log-httpd") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910161 4775 generic.go:334] "Generic (PLEG): container finished" podID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910193 4775 generic.go:334] "Generic (PLEG): container finished" podID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910254 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-55bc6945f7-5kkp2" event={"ID":"fe6b32f3-f53f-43ba-a349-2f00d5e657d0","Type":"ContainerDied","Data":"fa5db8a5c7621b855f9aee7c911007cac93d44ed2023e821a1db694da3d675fa"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.910305 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.912204 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-55bc6945f7-5kkp2" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.924776 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.943758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"c7cf618249468439ab9947bd79c2502d8d395f3ad8e3cc7a54d72b00a23938fe"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.943812 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" event={"ID":"1138f75c-8e56-4a32-8110-8b26d9f80688","Type":"ContainerStarted","Data":"3d2983e2073a6e23919993569a2db1e777666c79d856b0a59c0ce0ca9ce6a54e"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969141 4775 generic.go:334] "Generic (PLEG): container finished" podID="b138b14c-964d-465d-a534-c7aff1633e76" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" exitCode=0 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969232 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"b138b14c-964d-465d-a534-c7aff1633e76","Type":"ContainerDied","Data":"4be346d9744f80cbe9acdb090392b9c63c5e0cb6ed893fe6b3ae4a4e7c97ad5e"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.969287 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.977668 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh" (OuterVolumeSpecName: "kube-api-access-fc4bh") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "kube-api-access-fc4bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.977781 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.978485 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5897cf85c8-ppd2f" podStartSLOduration=4.978468775 podStartE2EDuration="4.978468775s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.876166593 +0000 UTC m=+1118.017764360" watchObservedRunningTime="2026-01-27 11:38:58.978468775 +0000 UTC m=+1118.120066552" Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993236 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerStarted","Data":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993374 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" containerID="cri-o://9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" gracePeriod=30 Jan 27 11:38:58 crc kubenswrapper[4775]: I0127 11:38:58.993644 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" containerID="cri-o://5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.004839 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d66b74d76-ngwn9" podStartSLOduration=4.004819805 podStartE2EDuration="4.004819805s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.91763095 +0000 UTC m=+1118.059228727" watchObservedRunningTime="2026-01-27 11:38:59.004819805 +0000 UTC m=+1118.146417582" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010149 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc4bh\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-kube-api-access-fc4bh\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010180 4775 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010192 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.010203 4775 reconciler_common.go:293] "Volume detached for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.012959 4775 scope.go:117] "RemoveContainer" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016000 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" containerID="cri-o://c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016183 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerStarted","Data":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.016213 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" containerID="cri-o://7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.039122 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78f66698d-fbfmx" podStartSLOduration=4.039105354 podStartE2EDuration="4.039105354s" podCreationTimestamp="2026-01-27 11:38:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:58.96416745 +0000 UTC m=+1118.105765227" watchObservedRunningTime="2026-01-27 11:38:59.039105354 +0000 UTC m=+1118.180703131" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.046083 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.051290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data" (OuterVolumeSpecName: "config-data") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.081558 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082520 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082826 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log" containerID="cri-o://0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.082964 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener" containerID="cri-o://9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f" gracePeriod=30 Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085055 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.085164 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085198 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} err="failed to get container status \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.085225 4775 scope.go:117] "RemoveContainer" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.096607 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fe6b32f3-f53f-43ba-a349-2f00d5e657d0" (UID: "fe6b32f3-f53f-43ba-a349-2f00d5e657d0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.101640 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.101692 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} err="failed to get container status \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.101720 4775 scope.go:117] "RemoveContainer" containerID="b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105189 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1"} err="failed to get container status \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": rpc error: code = NotFound desc = could not find container \"b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1\": container with ID starting with b59da22053248a85f944b1cf1fecd51f6d51559136428a02130c38a5615770a1 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105221 4775 scope.go:117] "RemoveContainer" containerID="3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.105275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.110503 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57"} err="failed to get container status \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": rpc error: code = NotFound desc = could not find container \"3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57\": container with ID starting with 3399dfeed6fd623ae88ef38084c71f2152c0a3e7cdaaa06f54e73ddf54965e57 not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.111217 4775 scope.go:117] "RemoveContainer" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112654 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112937 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112978 4775 reconciler_common.go:293] "Volume detached for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.112992 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.113006 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fe6b32f3-f53f-43ba-a349-2f00d5e657d0-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.123547 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124021 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124041 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124052 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124060 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124075 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124083 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124109 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124115 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124133 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124140 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.124154 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124160 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124340 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" containerName="horizon-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124358 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="98c20582-df9c-4ed1-8c42-0d5d1783e6f4" 
containerName="horizon" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124369 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124379 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" containerName="proxy-server" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124396 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-log" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.124406 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b138b14c-964d-465d-a534-c7aff1633e76" containerName="glance-httpd" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.125612 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.133416 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.133532 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.151250 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" podStartSLOduration=5.151228649 podStartE2EDuration="5.151228649s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:59.054275714 +0000 UTC m=+1118.195873491" watchObservedRunningTime="2026-01-27 11:38:59.151228649 +0000 UTC m=+1118.292826416" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.171509 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.178569 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6695647446-72d6k" podStartSLOduration=5.178436213 podStartE2EDuration="5.178436213s" podCreationTimestamp="2026-01-27 11:38:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:38:59.091828114 +0000 UTC m=+1118.233425901" watchObservedRunningTime="2026-01-27 11:38:59.178436213 +0000 UTC m=+1118.320033990" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.193620 4775 scope.go:117] "RemoveContainer" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214422 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214489 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214569 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214609 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214635 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.214740 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.254642 4775 scope.go:117] "RemoveContainer" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.267793 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": container with ID starting with 1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b not found: ID does not exist" containerID="1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.267838 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b"} err="failed to get container status \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": rpc error: code = NotFound 
desc = could not find container \"1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b\": container with ID starting with 1f5d3eb15cd75f08d3fb8a7d5c7a13b7c27e4d7c50373f1b425ac715350dbc3b not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.267865 4775 scope.go:117] "RemoveContainer" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 27 11:38:59 crc kubenswrapper[4775]: E0127 11:38:59.269463 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": container with ID starting with 164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b not found: ID does not exist" containerID="164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.269516 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b"} err="failed to get container status \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": rpc error: code = NotFound desc = could not find container \"164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b\": container with ID starting with 164925650d571dc00cefe4936c9221a6401312d5a765a3a5fd77ad5f0c3b393b not found: ID does not exist" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.288326 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.297555 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-55bc6945f7-5kkp2"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318541 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318590 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318624 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318688 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318792 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318819 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.318843 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319134 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.319226 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.322847 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.322960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.323272 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.329419 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.339535 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.356149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.470519 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.520340 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.520398 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.800872 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b138b14c-964d-465d-a534-c7aff1633e76" path="/var/lib/kubelet/pods/b138b14c-964d-465d-a534-c7aff1633e76/volumes" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.807280 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe6b32f3-f53f-43ba-a349-2f00d5e657d0" path="/var/lib/kubelet/pods/fe6b32f3-f53f-43ba-a349-2f00d5e657d0/volumes" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.845139 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.861893 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928242 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928296 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928410 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928518 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928587 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928602 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.928664 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") pod \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\" (UID: \"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64\") " Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.933084 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs" (OuterVolumeSpecName: "logs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.936727 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z" (OuterVolumeSpecName: "kube-api-access-nqt5z") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "kube-api-access-nqt5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.942626 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:38:59 crc kubenswrapper[4775]: I0127 11:38:59.996619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.019750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data" (OuterVolumeSpecName: "config-data") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030411 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqt5z\" (UniqueName: \"kubernetes.io/projected/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-kube-api-access-nqt5z\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030439 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030466 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030479 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.030491 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.032546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.050708 4775 generic.go:334] "Generic (PLEG): container finished" podID="9862a859-ad75-4071-ad9a-ec926175e46d" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.050879 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.065600 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" (UID: "ae7b9163-a675-4ed6-bedb-ce8f72dc6a64"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.067711 4775 generic.go:334] "Generic (PLEG): container finished" podID="31617f30-7431-401d-8c41-230d6a49ff72" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.067802 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082622 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082677 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082792 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5897cf85c8-ppd2f" event={"ID":"ae7b9163-a675-4ed6-bedb-ce8f72dc6a64","Type":"ContainerDied","Data":"332dd1d5955a659196a69a4a345219a2406c5c86fb913a32323480dc0fd29f46"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.082826 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.083055 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5897cf85c8-ppd2f" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.110549 4775 generic.go:334] "Generic (PLEG): container finished" podID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerID="2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.110675 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120846 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerID="9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f" exitCode=0 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120895 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerID="0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2" exitCode=143 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120935 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.120984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132205 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerStarted","Data":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132429 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent" containerID="cri-o://1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132759 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132823 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd" containerID="cri-o://cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132875 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent" containerID="cri-o://c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.132954 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" 
containerName="sg-core" containerID="cri-o://164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" gracePeriod=30 Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.152111 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.152150 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.174324 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.175320 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.231025 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.234620 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5897cf85c8-ppd2f"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.243219 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.614767303 podStartE2EDuration="13.243190384s" podCreationTimestamp="2026-01-27 11:38:47 +0000 UTC" firstStartedPulling="2026-01-27 11:38:48.391289366 +0000 UTC m=+1107.532887143" lastFinishedPulling="2026-01-27 11:38:59.019712447 +0000 UTC m=+1118.161310224" observedRunningTime="2026-01-27 11:39:00.167819377 +0000 UTC m=+1119.309417154" watchObservedRunningTime="2026-01-27 11:39:00.243190384 +0000 UTC m=+1119.384788161" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253861 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.253941 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254026 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254065 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.254357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"134ee9b9-bd65-48fb-9593-d0f29112e77e\" (UID: \"134ee9b9-bd65-48fb-9593-d0f29112e77e\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.255324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs" (OuterVolumeSpecName: "logs") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.255622 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.256728 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.256753 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134ee9b9-bd65-48fb-9593-d0f29112e77e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.270631 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.277174 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.277683 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts" (OuterVolumeSpecName: "scripts") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: E0127 11:39:00.283812 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283849 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} err="failed to get container status \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.283874 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: E0127 11:39:00.285543 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.285565 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} err="failed to get container status \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.285579 4775 scope.go:117] "RemoveContainer" containerID="a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.291814 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27"} err="failed to get container status \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": rpc error: code = NotFound desc = could not find container \"a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27\": container with ID starting with a9bb4c137fca5d76dc1be74649e2776aaaec2b77ef7f66efd43133d71b16dc27 not found: ID does not exist" Jan 27 
11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.291857 4775 scope.go:117] "RemoveContainer" containerID="c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.309099 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh" (OuterVolumeSpecName: "kube-api-access-p7wrh") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "kube-api-access-p7wrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.309667 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df"} err="failed to get container status \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": rpc error: code = NotFound desc = could not find container \"c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df\": container with ID starting with c3cadbc5bc34b3252524913f5a7a1b77a8e951bad18c5e5e6305b71b847bd8df not found: ID does not exist" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.354824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.355582 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data" (OuterVolumeSpecName: "config-data") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359130 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "134ee9b9-bd65-48fb-9593-d0f29112e77e" (UID: "134ee9b9-bd65-48fb-9593-d0f29112e77e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359423 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359466 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359482 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359492 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359501 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134ee9b9-bd65-48fb-9593-d0f29112e77e-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.359510 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7wrh\" (UniqueName: \"kubernetes.io/projected/134ee9b9-bd65-48fb-9593-d0f29112e77e-kube-api-access-p7wrh\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.363528 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.427695 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463201 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463245 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463404 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: 
I0127 11:39:00.463540 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") pod \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\" (UID: \"ca1756aa-c8c1-4f8e-9871-05e044a80c84\") " Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.463902 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.468028 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs" (OuterVolumeSpecName: "logs") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.473289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.479480 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm" (OuterVolumeSpecName: "kube-api-access-zjdqm") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "kube-api-access-zjdqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.510240 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565879 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565920 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca1756aa-c8c1-4f8e-9871-05e044a80c84-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565929 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.565938 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjdqm\" (UniqueName: \"kubernetes.io/projected/ca1756aa-c8c1-4f8e-9871-05e044a80c84-kube-api-access-zjdqm\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.689315 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data" (OuterVolumeSpecName: "config-data") pod "ca1756aa-c8c1-4f8e-9871-05e044a80c84" (UID: "ca1756aa-c8c1-4f8e-9871-05e044a80c84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:00 crc kubenswrapper[4775]: I0127 11:39:00.769354 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1756aa-c8c1-4f8e-9871-05e044a80c84-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164202 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134ee9b9-bd65-48fb-9593-d0f29112e77e","Type":"ContainerDied","Data":"815ca40b27fb4cea044b33dd23bf33c1b082f912269530f93879da29eb229030"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164251 4775 scope.go:117] "RemoveContainer" containerID="2f5a6906cc8f471f0d04ad0bdc4a6f5a9284f2bae71c74883779afada2270d60" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.164383 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.169846 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.170162 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" event={"ID":"ca1756aa-c8c1-4f8e-9871-05e044a80c84","Type":"ContainerDied","Data":"7725e0d31cab8fdd988ddc82ff5c6e00f8aac8edb67890b0869f5c2b5c515d21"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.170232 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-667698bbc6-zpl9x" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212229 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212262 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" exitCode=2 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212272 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212278 4775 generic.go:334] "Generic (PLEG): container finished" podID="ee6187b7-adff-4247-b9de-00f16380f27f" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212317 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212339 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212350 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212368 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ee6187b7-adff-4247-b9de-00f16380f27f","Type":"ContainerDied","Data":"8683e19a7bcd30570af286ce01224a28b785c454609defaa562ddd8aa8e80071"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.212430 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.229898 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"866d4ea18d0c5a4b3ffee3bd292c679cf834786becc86951c207630b9977d97c"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.234697 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.240769 4775 scope.go:117] "RemoveContainer" containerID="8ece19255413b1f459b9b434879cd49c181c9d1e505f96017ef83628747fdd1b" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.278823 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279587 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279625 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279674 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279707 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279736 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.279830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") pod \"ee6187b7-adff-4247-b9de-00f16380f27f\" (UID: \"ee6187b7-adff-4247-b9de-00f16380f27f\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.280750 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.281112 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.283570 4775 generic.go:334] "Generic (PLEG): container finished" podID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerID="42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" exitCode=0 Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.284431 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-695f7dfd45-zbb58" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.284587 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-695f7dfd45-zbb58" event={"ID":"ac6a9582-6a97-46b4-aa84-35ca9abe695c","Type":"ContainerDied","Data":"42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd"} Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.296603 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts" (OuterVolumeSpecName: "scripts") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.303907 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h" (OuterVolumeSpecName: "kube-api-access-tvx4h") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "kube-api-access-tvx4h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.319613 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.325092 4775 scope.go:117] "RemoveContainer" containerID="9d13207bfa59faf596deb2d40a70b14097428a29e9cd2f29e431ec69fafe695f" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.343800 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394360 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394401 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394514 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.394544 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") pod \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\" (UID: \"ac6a9582-6a97-46b4-aa84-35ca9abe695c\") " Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.397082 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs" (OuterVolumeSpecName: "logs") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398877 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398893 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398910 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ee6187b7-adff-4247-b9de-00f16380f27f-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398920 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ac6a9582-6a97-46b4-aa84-35ca9abe695c-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.398929 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvx4h\" (UniqueName: \"kubernetes.io/projected/ee6187b7-adff-4247-b9de-00f16380f27f-kube-api-access-tvx4h\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.407627 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-667698bbc6-zpl9x"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.408364 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h" (OuterVolumeSpecName: "kube-api-access-hh92h") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "kube-api-access-hh92h". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.408411 4775 scope.go:117] "RemoveContainer" containerID="0fa47ced9f0a1a66931599424fb0e02e42c9c45fd055acdeb51c078cfec19eb2" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.412865 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424347 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424721 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424737 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424754 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424760 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424770 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424784 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424789 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424798 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424804 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424815 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424820 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424830 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424836 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424843 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424849 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424861 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424867 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424883 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424890 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424901 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424909 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.424927 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.424933 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425072 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425084 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425095 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" containerName="glance-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425101 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425114 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" containerName="barbican-keystone-listener" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425126 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-central-agent" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425139 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="sg-core" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker-log" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425156 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" containerName="barbican-api" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425168 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="ceilometer-notification-agent" 
Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425176 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" containerName="proxy-httpd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.425187 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" containerName="barbican-worker" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.426130 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428511 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428664 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.428912 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.447904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.449175 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.463334 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.474255 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data" (OuterVolumeSpecName: "config-data") pod "ac6a9582-6a97-46b4-aa84-35ca9abe695c" (UID: "ac6a9582-6a97-46b4-aa84-35ca9abe695c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.494113 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data" (OuterVolumeSpecName: "config-data") pod "ee6187b7-adff-4247-b9de-00f16380f27f" (UID: "ee6187b7-adff-4247-b9de-00f16380f27f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501153 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501200 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501225 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501249 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501264 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.501986 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh92h\" (UniqueName: \"kubernetes.io/projected/ac6a9582-6a97-46b4-aa84-35ca9abe695c-kube-api-access-hh92h\") on node \"crc\" DevicePath 
\"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508516 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508548 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508560 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508571 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508582 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac6a9582-6a97-46b4-aa84-35ca9abe695c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.508591 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee6187b7-adff-4247-b9de-00f16380f27f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.585507 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.600554 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.609956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610057 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610112 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610136 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610157 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610178 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610221 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.610888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.611659 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.616170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.616235 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.621596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.627055 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.636747 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.636847 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645339 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645569 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.645601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.650767 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.650827 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.656051 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.667621 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.677550 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-695f7dfd45-zbb58"] Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711778 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711825 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711946 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.711994 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.712042 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.713612 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.725560 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.755866 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.765347 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134ee9b9-bd65-48fb-9593-d0f29112e77e" path="/var/lib/kubelet/pods/134ee9b9-bd65-48fb-9593-d0f29112e77e/volumes" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.765979 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac6a9582-6a97-46b4-aa84-35ca9abe695c" path="/var/lib/kubelet/pods/ac6a9582-6a97-46b4-aa84-35ca9abe695c/volumes" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.766678 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7b9163-a675-4ed6-bedb-ce8f72dc6a64" path="/var/lib/kubelet/pods/ae7b9163-a675-4ed6-bedb-ce8f72dc6a64/volumes" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.767801 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1756aa-c8c1-4f8e-9871-05e044a80c84" path="/var/lib/kubelet/pods/ca1756aa-c8c1-4f8e-9871-05e044a80c84/volumes" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.768379 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee6187b7-adff-4247-b9de-00f16380f27f" path="/var/lib/kubelet/pods/ee6187b7-adff-4247-b9de-00f16380f27f/volumes" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.782158 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.814999 4775 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815071 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815143 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815161 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815252 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.815273 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.816801 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.816998 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.817988 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818032 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818057 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818169 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.818343 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818376 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.818395 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.819380 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819419 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819432 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: E0127 11:39:01.819658 4775 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819678 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819712 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819903 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.819918 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821183 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821527 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.821560 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.822936 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.824207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826550 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826581 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826830 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.826853 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827110 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827158 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827177 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827474 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827517 4775 scope.go:117] "RemoveContainer" 
containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827752 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827781 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.827939 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828004 4775 scope.go:117] "RemoveContainer" containerID="cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828174 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15"} err="failed to get container status \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": rpc error: code = NotFound desc = could not find container \"cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15\": container with ID starting with cb0659b017e4910033d19c004ba55467d4bab3a55e36532293a3c679e65a1b15 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828192 4775 scope.go:117] "RemoveContainer" containerID="164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828317 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4"} err="failed to get container status \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": rpc error: code = NotFound desc = could not find container \"164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4\": container with ID starting with 164b265e93f722f565e909fbb3f0ec23be10adaaa065591fb95e449de812bae4 not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828334 4775 scope.go:117] "RemoveContainer" containerID="c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828514 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df"} err="failed to get container status \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": rpc error: code = NotFound desc = could not find 
container \"c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df\": container with ID starting with c7c2b3f236a8e3a0bee049d894987cbfbd0b1f870b3d0bc46b03890e3c90d4df not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828535 4775 scope.go:117] "RemoveContainer" containerID="1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828820 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd"} err="failed to get container status \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": rpc error: code = NotFound desc = could not find container \"1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd\": container with ID starting with 1d1d0f48f6781c6c93c3183bd9cc0e027635df2d7bac3109e409720400211fcd not found: ID does not exist" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.828840 4775 scope.go:117] "RemoveContainer" containerID="42504908b6e8629c4bfd13d446379584c5e9631e5f21f9d0d03ceb47fe02eefd" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.834193 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"ceilometer-0\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " pod="openstack/ceilometer-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.881506 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.897967 4775 scope.go:117] "RemoveContainer" containerID="156c73760afe4bfaf528d085e9a2fb00e063fb27928a61dc8179d4c23fd740db" Jan 27 11:39:01 crc kubenswrapper[4775]: I0127 11:39:01.989649 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.007340 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.323961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.324337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerStarted","Data":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.324349 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" containerID="cri-o://bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" gracePeriod=30 Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.325123 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" containerID="cri-o://0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" gracePeriod=30 Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.354801 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.354787822 podStartE2EDuration="3.354787822s" podCreationTimestamp="2026-01-27 11:38:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:02.353349052 +0000 UTC m=+1121.494946839" watchObservedRunningTime="2026-01-27 11:39:02.354787822 +0000 UTC m=+1121.496385599" Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.515939 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: I0127 11:39:02.550042 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:02 crc kubenswrapper[4775]: W0127 11:39:02.572743 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e7f615a_76c9_440f_aee6_0d33ad750021.slice/crio-54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545 WatchSource:0}: Error finding container 54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545: Status 404 returned error can't find the container with id 54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.228162 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351703 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351816 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351841 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351938 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.351956 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352007 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") pod \"34fbc599-e3e9-4317-a306-f1b4d677cd84\" (UID: \"34fbc599-e3e9-4317-a306-f1b4d677cd84\") " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352166 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs" (OuterVolumeSpecName: "logs") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.352601 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.353352 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.353376 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34fbc599-e3e9-4317-a306-f1b4d677cd84-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.358573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.364628 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts" (OuterVolumeSpecName: "scripts") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.364638 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88" (OuterVolumeSpecName: "kube-api-access-xjc88") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "kube-api-access-xjc88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.390258 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.390303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"3e083a2d7efd3e69a8c080a7d1e8f3788d6a8410341ddd6b9bdeed055744c49d"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.395901 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.403406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404393 4775 generic.go:334] "Generic (PLEG): container finished" podID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" exitCode=0 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404431 4775 generic.go:334] "Generic (PLEG): container finished" podID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" exitCode=143 Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404504 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404513 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"34fbc599-e3e9-4317-a306-f1b4d677cd84","Type":"ContainerDied","Data":"866d4ea18d0c5a4b3ffee3bd292c679cf834786becc86951c207630b9977d97c"} Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404546 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.404920 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.428211 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.442354 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data" (OuterVolumeSpecName: "config-data") pod "34fbc599-e3e9-4317-a306-f1b4d677cd84" (UID: "34fbc599-e3e9-4317-a306-f1b4d677cd84"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.443072 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455215 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455243 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455252 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjc88\" (UniqueName: \"kubernetes.io/projected/34fbc599-e3e9-4317-a306-f1b4d677cd84-kube-api-access-xjc88\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455262 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455291 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.455302 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34fbc599-e3e9-4317-a306-f1b4d677cd84-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.473800 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.556735 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.571371 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.572535 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.572580 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} err="failed to get container status \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": rpc error: code = NotFound desc = could not find container \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.572611 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.573168 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573201 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} err="failed to get container status \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573223 4775 scope.go:117] "RemoveContainer" containerID="0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573651 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb"} err="failed to get container status \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": rpc error: code = NotFound desc = could not find container \"0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb\": container with ID starting with 0e5e0ba5612b549c141ca534637e1af36ba888abb910f3163411ed2e136b3dbb not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.573745 4775 scope.go:117] "RemoveContainer" containerID="bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.574393 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a"} err="failed to get container status \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": rpc error: code = NotFound desc = could not find container \"bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a\": container with ID starting with 
bfe092c69145931418b10e70f6690260283f63da08c5477b27057986fbb9915a not found: ID does not exist" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.762622 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.782413 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.815406 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.815867 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.815885 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: E0127 11:39:03.815900 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.815908 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.816061 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-log" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.816081 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" containerName="glance-httpd" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.817105 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.819535 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.824295 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.830974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966199 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966262 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966302 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966363 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:03 crc kubenswrapper[4775]: I0127 11:39:03.966650 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068535 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068657 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068684 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068713 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.068789 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069040 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069298 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.069330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/899a9893-167d-4c9c-9495-3c663c7d0855-logs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075852 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.075984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-config-data\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.092054 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbcz2\" (UniqueName: \"kubernetes.io/projected/899a9893-167d-4c9c-9495-3c663c7d0855-kube-api-access-vbcz2\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.115009 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.119729 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/899a9893-167d-4c9c-9495-3c663c7d0855-scripts\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.145489 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"899a9893-167d-4c9c-9495-3c663c7d0855\") " pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.417776 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerStarted","Data":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.417928 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" containerID="cri-o://3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" gracePeriod=30 Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.418369 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" containerID="cri-o://e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" gracePeriod=30 Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.425906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.425951 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.437238 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.446690 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.446673384 podStartE2EDuration="3.446673384s" podCreationTimestamp="2026-01-27 11:39:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:04.443295111 +0000 UTC m=+1123.584892888" watchObservedRunningTime="2026-01-27 11:39:04.446673384 +0000 UTC m=+1123.588271161" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.583062 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.584908 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.596943 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679270 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679372 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679476 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.679538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781745 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781808 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781837 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.781862 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782129 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782216 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.782257 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.790673 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-public-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.791973 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-combined-ca-bundle\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.793975 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-internal-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.794211 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-ovndb-tls-certs\") pod \"neutron-5c59c678b7-lbtkp\" 
(UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.794219 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-httpd-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.796755 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/857ed116-b219-4af4-9c38-69e85db0c484-config\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.803771 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjl8l\" (UniqueName: \"kubernetes.io/projected/857ed116-b219-4af4-9c38-69e85db0c484-kube-api-access-vjl8l\") pod \"neutron-5c59c678b7-lbtkp\" (UID: \"857ed116-b219-4af4-9c38-69e85db0c484\") " pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:04 crc kubenswrapper[4775]: I0127 11:39:04.914624 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.056966 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.092102 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.190881 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191097 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 
11:39:05.191478 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191530 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.191548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") pod \"d4944db9-7805-486d-bd2f-38245c9eecbf\" (UID: \"d4944db9-7805-486d-bd2f-38245c9eecbf\") " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.192191 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.192382 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs" (OuterVolumeSpecName: "logs") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.197810 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts" (OuterVolumeSpecName: "scripts") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.198858 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh" (OuterVolumeSpecName: "kube-api-access-zz8mh") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "kube-api-access-zz8mh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.202552 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.237534 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.260329 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.265801 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data" (OuterVolumeSpecName: "config-data") pod "d4944db9-7805-486d-bd2f-38245c9eecbf" (UID: "d4944db9-7805-486d-bd2f-38245c9eecbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293430 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293476 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293512 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293521 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293533 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293540 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4944db9-7805-486d-bd2f-38245c9eecbf-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293548 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz8mh\" (UniqueName: \"kubernetes.io/projected/d4944db9-7805-486d-bd2f-38245c9eecbf-kube-api-access-zz8mh\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.293557 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4944db9-7805-486d-bd2f-38245c9eecbf-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.321644 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.395293 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 
Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442046 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" exitCode=0 Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442080 4775 generic.go:334] "Generic (PLEG): container finished" podID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" exitCode=143 Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.442160 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444502 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444578 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"d4944db9-7805-486d-bd2f-38245c9eecbf","Type":"ContainerDied","Data":"3e083a2d7efd3e69a8c080a7d1e8f3788d6a8410341ddd6b9bdeed055744c49d"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.444608 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.458394 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.465199 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"4b5a2197f63f7e229b480975e868d0f8d5dab8b2b247e1b010060a420354577c"} Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.543085 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.544383 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.553463 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.567703 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.568055 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.568094 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568100 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568268 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-log" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.568286 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" containerName="glance-httpd" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.569180 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.572728 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.572938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.599233 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c59c678b7-lbtkp"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.626607 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.641021 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.642129 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not exist" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.642172 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} err="failed to get container status \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.642200 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: E0127 11:39:05.644033 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644078 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} err="failed to get container status \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644112 4775 scope.go:117] "RemoveContainer" containerID="e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644417 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3"} err="failed to get container status \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": rpc error: code = NotFound desc = could not find container \"e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3\": container with ID starting with e597d48385cf7cf03112259bdebb16bcd846d7a9ba12e2e4437a33cf628db7e3 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644443 4775 scope.go:117] "RemoveContainer" containerID="3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.644829 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2"} err="failed to get container status \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": rpc error: code = NotFound desc = could not find container \"3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2\": container with ID starting with 3ca62031c13377b479a2058e0b3520411622e8193b629df040a45b9bdd07b8a2 not found: ID does not exist" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705324 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705433 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705468 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705519 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705548 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705617 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.705691 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.756851 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34fbc599-e3e9-4317-a306-f1b4d677cd84" path="/var/lib/kubelet/pods/34fbc599-e3e9-4317-a306-f1b4d677cd84/volumes" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.757714 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4944db9-7805-486d-bd2f-38245c9eecbf" path="/var/lib/kubelet/pods/d4944db9-7805-486d-bd2f-38245c9eecbf/volumes" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806854 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806915 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.806954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807097 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807145 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807191 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807269 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.807289 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.808977 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.809923 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.810783 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d8a9ef1-1171-438f-be81-89f670bd9735-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.813255 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.815826 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.816052 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.830438 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d8a9ef1-1171-438f-be81-89f670bd9735-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.841198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7vc7\" (UniqueName: \"kubernetes.io/projected/2d8a9ef1-1171-438f-be81-89f670bd9735-kube-api-access-t7vc7\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.879890 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d8a9ef1-1171-438f-be81-89f670bd9735\") " pod="openstack/glance-default-internal-api-0" Jan 27 11:39:05 crc kubenswrapper[4775]: I0127 11:39:05.902763 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.477952 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"c86790369dd857d1339351cb0cc3b769915d8acc3e30cef67b677d972658fab5"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480931 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"60553b81c31faad82df95aaef48f5879e450ffeefbcf8061f9e527a356e9485b"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"f13b1d6e397378d1eee56a32024bde6e8c9531482a5a4d55f375e52235aecf63"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.480966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c59c678b7-lbtkp" event={"ID":"857ed116-b219-4af4-9c38-69e85db0c484","Type":"ContainerStarted","Data":"19c5e38a7afe1201ed2fcf03488fde5358e005044bc9b2f06a799a829ad93eda"} Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.481652 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.514678 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c59c678b7-lbtkp" podStartSLOduration=2.514654335 podStartE2EDuration="2.514654335s" podCreationTimestamp="2026-01-27 11:39:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:06.500353749 +0000 UTC m=+1125.641951526" watchObservedRunningTime="2026-01-27 11:39:06.514654335 +0000 UTC m=+1125.656252112" Jan 27 11:39:06 crc kubenswrapper[4775]: I0127 11:39:06.547239 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.497934 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerStarted","Data":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498375 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent" containerID="cri-o://aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498662 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498900 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd" containerID="cri-o://fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498958 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core" containerID="cri-o://ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.498994 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent" containerID="cri-o://511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" gracePeriod=30 Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.514900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"899a9893-167d-4c9c-9495-3c663c7d0855","Type":"ContainerStarted","Data":"549b5fac54367f7d975358805a6394a66ed3b495f484400d7487054f1267fe23"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.532586 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"8afdb7a742646a4746b16324c837e645ca5009f0a8b343509ea9298ff6e74d52"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.532645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"a568fb9e5bd89e96bdc6bd85d9565d6d4542a090a3f31f684a01c2bb795fc74c"} Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.534343 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.991628057 podStartE2EDuration="6.534320359s" podCreationTimestamp="2026-01-27 11:39:01 +0000 UTC" 
firstStartedPulling="2026-01-27 11:39:02.578176548 +0000 UTC m=+1121.719774325" lastFinishedPulling="2026-01-27 11:39:07.12086885 +0000 UTC m=+1126.262466627" observedRunningTime="2026-01-27 11:39:07.523186281 +0000 UTC m=+1126.664784058" watchObservedRunningTime="2026-01-27 11:39:07.534320359 +0000 UTC m=+1126.675918126" Jan 27 11:39:07 crc kubenswrapper[4775]: I0127 11:39:07.555783 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.555760462 podStartE2EDuration="4.555760462s" podCreationTimestamp="2026-01-27 11:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:07.543860962 +0000 UTC m=+1126.685458729" watchObservedRunningTime="2026-01-27 11:39:07.555760462 +0000 UTC m=+1126.697358239" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.007850 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.491552 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d66b74d76-ngwn9" Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.570874 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.571110 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" containerID="cri-o://9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" gracePeriod=30 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.571234 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" containerID="cri-o://88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" gracePeriod=30 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.573531 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d8a9ef1-1171-438f-be81-89f670bd9735","Type":"ContainerStarted","Data":"af0056b03ab936b02a4f430068d2d283b8afd820daef0e50054f68024a0623c1"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595682 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" exitCode=0 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595716 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" exitCode=2 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.595725 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" exitCode=0 Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596525 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} Jan 27 11:39:08 crc 
Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.596560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} Jan 27 11:39:08 crc kubenswrapper[4775]: I0127 11:39:08.608896 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.608878362 podStartE2EDuration="3.608878362s" podCreationTimestamp="2026-01-27 11:39:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:08.598228177 +0000 UTC m=+1127.739825944" watchObservedRunningTime="2026-01-27 11:39:08.608878362 +0000 UTC m=+1127.750476139" Jan 27 11:39:08 crc kubenswrapper[4775]: E0127 11:39:08.766252 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59717e39_e3c7_40b2_89c7_7b898f3b72e7.slice/crio-conmon-9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod59717e39_e3c7_40b2_89c7_7b898f3b72e7.slice/crio-9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:39:09 crc kubenswrapper[4775]: I0127 11:39:09.639166 4775 generic.go:334] "Generic (PLEG): container finished" podID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" exitCode=143 Jan 27 11:39:09 crc kubenswrapper[4775]: I0127 11:39:09.639487 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"} Jan 27 11:39:10 crc kubenswrapper[4775]: I0127 11:39:10.648867 4775 generic.go:334] "Generic (PLEG): container finished" podID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerID="cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0" exitCode=0 Jan 27 11:39:10 crc kubenswrapper[4775]: I0127 11:39:10.648943 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerDied","Data":"cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0"} Jan 27 11:39:11 crc kubenswrapper[4775]: I0127 11:39:11.851493 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-8bc6678d8-674l9" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused"
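
The exit codes above follow the usual 128+signal convention: the grace-period kill at 11:39:08 delivers SIGTERM first (SIGKILL only follows if the container outlives the 30s grace period), and barbican-api-log dies with exitCode=143, i.e. 128+15 for SIGTERM; exitCode=0 is a clean shutdown, while sg-core's exitCode=2 is the process's own error status. The readiness-probe "connection refused" failures at 11:39:11 are the expected side effect of the API listener closing while the pod object still exists. A minimal Linux sketch of the 143 convention ("sleep" on PATH is assumed):

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// Start a long-running child, then deliver SIGTERM, which is what a
	// grace-period kill sends first.
	cmd := exec.Command("sleep", "30")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	time.Sleep(100 * time.Millisecond)
	_ = cmd.Process.Signal(syscall.SIGTERM)
	_ = cmd.Wait() // returns *exec.ExitError because the child was signaled

	if ws, ok := cmd.ProcessState.Sys().(syscall.WaitStatus); ok && ws.Signaled() {
		// SIGTERM is signal 15, so this prints 143 (128+15), matching the
		// exitCode=143 reported for barbican-api-log above.
		fmt.Println("exit code:", 128+int(ws.Signal()))
	}
}

Note that Go itself reports signaled children through the wait status rather than an exit number; the 128+N form is the shell convention that container runtimes surface, which is why it shows up verbatim in the PLEG entries.
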
containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.164:9311/healthcheck\": dial tcp 10.217.0.164:9311: connect: connection refused" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.006637 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083665 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.083893 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") pod \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\" (UID: \"1b5e7b0a-a4d0-4c64-b273-2b47230efd17\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.088749 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n" (OuterVolumeSpecName: "kube-api-access-vbw4n") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "kube-api-access-vbw4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.092361 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts" (OuterVolumeSpecName: "scripts") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.115308 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data" (OuterVolumeSpecName: "config-data") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.116374 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b5e7b0a-a4d0-4c64-b273-2b47230efd17" (UID: "1b5e7b0a-a4d0-4c64-b273-2b47230efd17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185901 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185947 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185962 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbw4n\" (UniqueName: \"kubernetes.io/projected/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-kube-api-access-vbw4n\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.185977 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1b5e7b0a-a4d0-4c64-b273-2b47230efd17-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.211666 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.343054 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390086 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390164 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390200 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390351 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390383 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" 
(UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.390444 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") pod \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\" (UID: \"59717e39-e3c7-40b2-89c7-7b898f3b72e7\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.393218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs" (OuterVolumeSpecName: "logs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.399546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.399784 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn" (OuterVolumeSpecName: "kube-api-access-jsdsn") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "kube-api-access-jsdsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.438444 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.442629 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.444289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data" (OuterVolumeSpecName: "config-data") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.445769 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "59717e39-e3c7-40b2-89c7-7b898f3b72e7" (UID: "59717e39-e3c7-40b2-89c7-7b898f3b72e7"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492422 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492516 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492618 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492670 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") pod \"0e7f615a-76c9-440f-aee6-0d33ad750021\" (UID: \"0e7f615a-76c9-440f-aee6-0d33ad750021\") " Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.492818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493064 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493086 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493098 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/59717e39-e3c7-40b2-89c7-7b898f3b72e7-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493106 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493115 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493124 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493132 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/59717e39-e3c7-40b2-89c7-7b898f3b72e7-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493140 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsdsn\" (UniqueName: \"kubernetes.io/projected/59717e39-e3c7-40b2-89c7-7b898f3b72e7-kube-api-access-jsdsn\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.493112 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.495885 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8" (OuterVolumeSpecName: "kube-api-access-skgc8") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "kube-api-access-skgc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.497206 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts" (OuterVolumeSpecName: "scripts") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.515939 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.559375 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594549 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skgc8\" (UniqueName: \"kubernetes.io/projected/0e7f615a-76c9-440f-aee6-0d33ad750021-kube-api-access-skgc8\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594585 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594594 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594602 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.594612 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e7f615a-76c9-440f-aee6-0d33ad750021-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.595035 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data" (OuterVolumeSpecName: "config-data") pod "0e7f615a-76c9-440f-aee6-0d33ad750021" (UID: "0e7f615a-76c9-440f-aee6-0d33ad750021"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667046 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" event={"ID":"1b5e7b0a-a4d0-4c64-b273-2b47230efd17","Type":"ContainerDied","Data":"d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667399 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9265de06875404ccb8f671d76f819216e0e3fe1c45b67872bef4047a61868b0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.667081 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-6bh7g" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.669932 4775 generic.go:334] "Generic (PLEG): container finished" podID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" exitCode=0 Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670001 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8bc6678d8-674l9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670118 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8bc6678d8-674l9" event={"ID":"59717e39-e3c7-40b2-89c7-7b898f3b72e7","Type":"ContainerDied","Data":"7741a0906f599fd7687720fdb78021f6c23a07fdd0533bbdc83dc1e97a16a161"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.670138 4775 scope.go:117] "RemoveContainer" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673703 4775 generic.go:334] "Generic (PLEG): container finished" podID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" exitCode=0 Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673749 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673775 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0e7f615a-76c9-440f-aee6-0d33ad750021","Type":"ContainerDied","Data":"54fbd787f49c4f61fd7730c91b459bd75d94e94c1b796cc8a26f95081ed92545"} Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.673800 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.696357 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e7f615a-76c9-440f-aee6-0d33ad750021-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.731381 4775 scope.go:117] "RemoveContainer" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.732787 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.743397 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-8bc6678d8-674l9"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.753340 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.766765 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.766831 4775 scope.go:117] "RemoveContainer" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.767904 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": container with ID starting with 88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c not found: ID does not exist" containerID="88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.767970 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c"} err="failed to get container status \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": rpc error: code = NotFound desc = could not find container \"88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c\": container with ID starting with 88fca55e218e5b497584c403e4713c547a9bcaf708edf63fb61838c02b106c7c not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768018 4775 scope.go:117] "RemoveContainer" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.768508 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": container with ID starting with 9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b not found: ID does not exist" containerID="9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768531 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b"} err="failed to get container status \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": rpc error: code = NotFound desc = could not find container \"9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b\": container with ID starting with 
9b581a8d58db2115a9c7b3fa5a7679c83840e7080fd44b6f470c39f570b3656b not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.768546 4775 scope.go:117] "RemoveContainer" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778434 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778884 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778907 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778932 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778941 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778953 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778961 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.778977 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.778986 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779009 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779017 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779042 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779050 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.779060 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779068 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779312 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api-log" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779334 4775 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" containerName="barbican-api" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779352 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="sg-core" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779365 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-central-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779378 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="proxy-httpd" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779389 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" containerName="ceilometer-notification-agent" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.779404 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" containerName="nova-cell0-conductor-db-sync" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.782098 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.787072 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.787930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.793446 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.811799 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.815669 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.837938 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.838445 4775 scope.go:117] "RemoveContainer" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.840776 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.847941 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899720 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899828 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899896 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899923 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.899948 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900007 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900030 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900070 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900110 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.900160 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.917694 4775 scope.go:117] "RemoveContainer" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.946555 4775 scope.go:117] "RemoveContainer" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966013 4775 scope.go:117] "RemoveContainer" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.966441 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": container with ID starting with fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2 not found: ID does not exist" containerID="fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966516 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2"} err="failed to get container status \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": rpc error: code = NotFound desc = could not find container \"fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2\": container with ID starting with fc840499440ee1d82764de75f6e733b53ec287be807edac66edf4439e965f2a2 not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.966547 4775 scope.go:117] "RemoveContainer" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.967681 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": container with ID starting with ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9 not found: ID does not exist" containerID="ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.967719 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9"} err="failed to get container status 
\"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": rpc error: code = NotFound desc = could not find container \"ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9\": container with ID starting with ac5e864c2f9811c52d7a14b4122af121ec4ae26ee02d4db644c4d9d91d420ef9 not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.967741 4775 scope.go:117] "RemoveContainer" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.968226 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": container with ID starting with 511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84 not found: ID does not exist" containerID="511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968255 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84"} err="failed to get container status \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": rpc error: code = NotFound desc = could not find container \"511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84\": container with ID starting with 511c580b7c1c366b0946b2da61db9af26a5d0d128baf5bdda90ac1ffac6d3f84 not found: ID does not exist" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968277 4775 scope.go:117] "RemoveContainer" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" Jan 27 11:39:12 crc kubenswrapper[4775]: E0127 11:39:12.968526 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": container with ID starting with aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1 not found: ID does not exist" containerID="aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1" Jan 27 11:39:12 crc kubenswrapper[4775]: I0127 11:39:12.968549 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1"} err="failed to get container status \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": rpc error: code = NotFound desc = could not find container \"aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1\": container with ID starting with aefd057f21e17c6c7ae21802d8e33e1ee004b69b423451cca389d1c6ac4af9c1 not found: ID does not exist" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001698 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc 
kubenswrapper[4775]: I0127 11:39:13.001733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001853 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001881 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.001949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.002507 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.002967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 
11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007369 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007387 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.007910 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008343 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008674 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.008967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.026074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"ceilometer-0\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.032772 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"nova-cell0-conductor-0\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.226784 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.237211 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.758894 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e7f615a-76c9-440f-aee6-0d33ad750021" path="/var/lib/kubelet/pods/0e7f615a-76c9-440f-aee6-0d33ad750021/volumes" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.760135 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59717e39-e3c7-40b2-89c7-7b898f3b72e7" path="/var/lib/kubelet/pods/59717e39-e3c7-40b2-89c7-7b898f3b72e7/volumes" Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.779154 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.795611 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:13 crc kubenswrapper[4775]: I0127 11:39:13.852494 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.438608 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.438841 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.486386 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.490275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.535052 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693623 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerStarted","Data":"cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6"} Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693665 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerStarted","Data":"82d4d61311885172aa8b3e5cc80375eb709a13d1d92b08eb5c2530bda351308b"} Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.693770 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" containerID="cri-o://cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" gracePeriod=30 Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.694639 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.698676 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"2a2ee9ecd020ed63d838c367608617b5c5b9bef053fb9d27e529ac66f6e55c5a"} Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.698720 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 
27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.699033 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Jan 27 11:39:14 crc kubenswrapper[4775]: I0127 11:39:14.709512 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.709492832 podStartE2EDuration="2.709492832s" podCreationTimestamp="2026-01-27 11:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:14.708471703 +0000 UTC m=+1133.850069490" watchObservedRunningTime="2026-01-27 11:39:14.709492832 +0000 UTC m=+1133.851090609" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.175980 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f49dbf586-l2cmp"] Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.177741 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.203269 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f49dbf586-l2cmp"] Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242639 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242727 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242767 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242804 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242830 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: 
\"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.242863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.343956 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344200 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345142 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.345492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.344517 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2b3edac4-ba7b-4c93-b66f-43ab468d290f-logs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 
11:39:15.349300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-combined-ca-bundle\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.350236 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-scripts\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.350905 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-internal-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.353367 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-config-data\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.354923 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2b3edac4-ba7b-4c93-b66f-43ab468d290f-public-tls-certs\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.359822 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfs9b\" (UniqueName: \"kubernetes.io/projected/2b3edac4-ba7b-4c93-b66f-43ab468d290f-kube-api-access-pfs9b\") pod \"placement-f49dbf586-l2cmp\" (UID: \"2b3edac4-ba7b-4c93-b66f-43ab468d290f\") " pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.494931 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.714885 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c"} Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.715237 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907"} Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.904126 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.906123 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.946595 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.972636 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f49dbf586-l2cmp"] Jan 27 11:39:15 crc kubenswrapper[4775]: I0127 11:39:15.993866 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.724611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c"} Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726342 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"8e6019244658e5cfc5a7245dc638e28477faa58477eb7d39abb672b26b32efcb"} Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"875c83b883fa3bd728bd63d7a970181ee212cc95fecb472fd0f7adc7fa462bcb"} Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726822 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f49dbf586-l2cmp" event={"ID":"2b3edac4-ba7b-4c93-b66f-43ab468d290f","Type":"ContainerStarted","Data":"62ecae1c1001fdf4b7f185c0a5db15c9dc33752f0d26ed2714229213f160919a"} Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.726917 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.727000 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.746398 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.746494 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.752726 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-f49dbf586-l2cmp" podStartSLOduration=1.752702127 podStartE2EDuration="1.752702127s" podCreationTimestamp="2026-01-27 11:39:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:16.75101122 +0000 UTC m=+1135.892608997" watchObservedRunningTime="2026-01-27 11:39:16.752702127 +0000 UTC m=+1135.894299904" Jan 27 11:39:16 crc kubenswrapper[4775]: I0127 11:39:16.753287 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Jan 27 11:39:17 crc kubenswrapper[4775]: I0127 11:39:17.734940 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:17 crc kubenswrapper[4775]: I0127 11:39:17.734988 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.749997 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent" containerID="cri-o://19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750483 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerStarted","Data":"87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9"} Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750528 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750655 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" containerID="cri-o://87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750726 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core" containerID="cri-o://b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.750784 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent" containerID="cri-o://5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c" gracePeriod=30 Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.773537 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.773818 4775 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.783090 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.207016299 podStartE2EDuration="6.783072256s" podCreationTimestamp="2026-01-27 11:39:12 +0000 UTC" firstStartedPulling="2026-01-27 11:39:13.802173529 +0000 
UTC m=+1132.943771306" lastFinishedPulling="2026-01-27 11:39:18.378229476 +0000 UTC m=+1137.519827263" observedRunningTime="2026-01-27 11:39:18.780194376 +0000 UTC m=+1137.921792173" watchObservedRunningTime="2026-01-27 11:39:18.783072256 +0000 UTC m=+1137.924670043" Jan 27 11:39:18 crc kubenswrapper[4775]: I0127 11:39:18.824275 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760414 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c" exitCode=2 Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760847 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c" exitCode=0 Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.760566 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c"} Jan 27 11:39:19 crc kubenswrapper[4775]: I0127 11:39:19.761769 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c"} Jan 27 11:39:21 crc kubenswrapper[4775]: I0127 11:39:21.779264 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907" exitCode=0 Jan 27 11:39:21 crc kubenswrapper[4775]: I0127 11:39:21.779310 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907"} Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.239500 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.241151 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.243029 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:23 crc kubenswrapper[4775]: E0127 11:39:23.243123 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.239685 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.241597 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.243019 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:28 crc kubenswrapper[4775]: E0127 11:39:28.243049 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.333531 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9862a859_ad75_4071_ad9a_ec926175e46d.slice/crio-conmon-5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.443689 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.448547 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.518490 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.518567 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.522276 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523165 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523258 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523656 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523704 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523767 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") pod 
\"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523805 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") pod \"9862a859-ad75-4071-ad9a-ec926175e46d\" (UID: \"9862a859-ad75-4071-ad9a-ec926175e46d\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523853 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") pod \"31617f30-7431-401d-8c41-230d6a49ff72\" (UID: \"31617f30-7431-401d-8c41-230d6a49ff72\") " Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.523915 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs" (OuterVolumeSpecName: "logs") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.524300 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9862a859-ad75-4071-ad9a-ec926175e46d-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.525155 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs" (OuterVolumeSpecName: "logs") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.530644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s" (OuterVolumeSpecName: "kube-api-access-4zn9s") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "kube-api-access-4zn9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.532649 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.541639 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2" (OuterVolumeSpecName: "kube-api-access-hvqr2") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "kube-api-access-hvqr2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.541716 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.547138 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.552381 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.574291 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data" (OuterVolumeSpecName: "config-data") pod "31617f30-7431-401d-8c41-230d6a49ff72" (UID: "31617f30-7431-401d-8c41-230d6a49ff72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.580772 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data" (OuterVolumeSpecName: "config-data") pod "9862a859-ad75-4071-ad9a-ec926175e46d" (UID: "9862a859-ad75-4071-ad9a-ec926175e46d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626404 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626545 4775 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626559 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31617f30-7431-401d-8c41-230d6a49ff72-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626574 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626585 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626597 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zn9s\" (UniqueName: \"kubernetes.io/projected/31617f30-7431-401d-8c41-230d6a49ff72-kube-api-access-4zn9s\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626610 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31617f30-7431-401d-8c41-230d6a49ff72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626620 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvqr2\" (UniqueName: \"kubernetes.io/projected/9862a859-ad75-4071-ad9a-ec926175e46d-kube-api-access-hvqr2\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.626631 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9862a859-ad75-4071-ad9a-ec926175e46d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851606 4775 generic.go:334] "Generic (PLEG): container finished" podID="9862a859-ad75-4071-ad9a-ec926175e46d" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" exitCode=137 Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851673 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851766 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6695647446-72d6k" event={"ID":"9862a859-ad75-4071-ad9a-ec926175e46d","Type":"ContainerDied","Data":"c2a8847ef3756637a0ac2e98b536e6dfeb366c6e1256763e5e2606e3b7895d3a"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851814 4775 scope.go:117] "RemoveContainer" 
containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.851813 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6695647446-72d6k" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854518 4775 generic.go:334] "Generic (PLEG): container finished" podID="31617f30-7431-401d-8c41-230d6a49ff72" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" exitCode=137 Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854577 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854608 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" event={"ID":"31617f30-7431-401d-8c41-230d6a49ff72","Type":"ContainerDied","Data":"94dd9f79f758d901a5bff17b96dea4bc02bd0921b66a706ae353879746b66d0f"} Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.854678 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-6d876c7c6f-jvj5b" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.889733 4775 scope.go:117] "RemoveContainer" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.903702 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.912375 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6695647446-72d6k"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.912889 4775 scope.go:117] "RemoveContainer" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.913432 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": container with ID starting with 5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba not found: ID does not exist" containerID="5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.913539 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba"} err="failed to get container status \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": rpc error: code = NotFound desc = could not find container \"5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba\": container with ID starting with 5428a6ea850aeb3d23de7230a0c035d655b09682a141c94924480daec086c9ba not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.913576 4775 scope.go:117] "RemoveContainer" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.913990 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": container with ID starting with 9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3 not found: ID does not exist" containerID="9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.914192 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3"} err="failed to get container status \"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": rpc error: code = NotFound desc = could not find container \"9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3\": container with ID starting with 9b3d3debee8aef92ac0d5ab8e20147ba798be255d822939933a6677a990a4eb3 not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.914400 4775 scope.go:117] "RemoveContainer" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.920887 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.928043 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-6d876c7c6f-jvj5b"] Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.932973 4775 scope.go:117] "RemoveContainer" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.956687 4775 scope.go:117] "RemoveContainer" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.957132 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": container with ID starting with 7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f not found: ID does not exist" containerID="7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957178 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f"} err="failed to get container status \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": rpc error: code = NotFound desc = could not find container \"7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f\": container with ID starting with 7e5775a0f7bb760e2e3ea23521e42b57ba2d6fb9a3ea4bafd212b130c87d2a0f not found: ID does not exist" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957198 4775 scope.go:117] "RemoveContainer" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: E0127 11:39:29.957616 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": container with ID starting with c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63 not found: ID does not exist" containerID="c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63" Jan 27 11:39:29 crc kubenswrapper[4775]: I0127 11:39:29.957635 4775 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63"} err="failed to get container status \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": rpc error: code = NotFound desc = could not find container \"c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63\": container with ID starting with c6da089263cf0878e4e012a2f466ba54746e62815e31a8c24fe13c5e245cac63 not found: ID does not exist" Jan 27 11:39:31 crc kubenswrapper[4775]: I0127 11:39:31.757716 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31617f30-7431-401d-8c41-230d6a49ff72" path="/var/lib/kubelet/pods/31617f30-7431-401d-8c41-230d6a49ff72/volumes" Jan 27 11:39:31 crc kubenswrapper[4775]: I0127 11:39:31.758917 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" path="/var/lib/kubelet/pods/9862a859-ad75-4071-ad9a-ec926175e46d/volumes" Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.238840 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.241407 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.242641 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:33 crc kubenswrapper[4775]: E0127 11:39:33.242772 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.925818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5c59c678b7-lbtkp" Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.986273 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.987437 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f57cbf767-xvk7k" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" containerID="cri-o://0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" gracePeriod=30 Jan 27 11:39:34 crc kubenswrapper[4775]: I0127 11:39:34.987625 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f57cbf767-xvk7k" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" 
containerID="cri-o://59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" gracePeriod=30 Jan 27 11:39:35 crc kubenswrapper[4775]: I0127 11:39:35.914971 4775 generic.go:334] "Generic (PLEG): container finished" podID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerID="59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" exitCode=0 Jan 27 11:39:35 crc kubenswrapper[4775]: I0127 11:39:35.915077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f"} Jan 27 11:39:36 crc kubenswrapper[4775]: I0127 11:39:36.925234 4775 generic.go:334] "Generic (PLEG): container finished" podID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerID="0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" exitCode=0 Jan 27 11:39:36 crc kubenswrapper[4775]: I0127 11:39:36.925418 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23"} Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.143355 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274336 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274700 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274840 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 
11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.274886 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") pod \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\" (UID: \"17e205ad-6676-4f5d-b9d0-0d8c958d815d\") " Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.280247 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.282175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82" (OuterVolumeSpecName: "kube-api-access-2rp82") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "kube-api-access-2rp82". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.334623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.334674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.335511 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config" (OuterVolumeSpecName: "config") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.337439 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.355182 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "17e205ad-6676-4f5d-b9d0-0d8c958d815d" (UID: "17e205ad-6676-4f5d-b9d0-0d8c958d815d"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376941 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376982 4775 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.376995 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377005 4775 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rp82\" (UniqueName: \"kubernetes.io/projected/17e205ad-6676-4f5d-b9d0-0d8c958d815d-kube-api-access-2rp82\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377031 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.377042 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17e205ad-6676-4f5d-b9d0-0d8c958d815d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.935482 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f57cbf767-xvk7k" event={"ID":"17e205ad-6676-4f5d-b9d0-0d8c958d815d","Type":"ContainerDied","Data":"53a128ffc6e310fa157dfd37a105cff396b2195c605357ef2976ef48f28caaf9"} Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.936903 4775 scope.go:117] "RemoveContainer" containerID="59aabef6148d4c27f5f6e5830e2db33d7bd3fb4d58f0d43a0d6775f307bccf5f" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.935719 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f57cbf767-xvk7k" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.968804 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.971532 4775 scope.go:117] "RemoveContainer" containerID="0848da506d9d1e315e77e35c04fd69a834a63c3befc2e31f43e2dc6541968a23" Jan 27 11:39:37 crc kubenswrapper[4775]: I0127 11:39:37.977675 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f57cbf767-xvk7k"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.240631 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.242943 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.244199 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:38 crc kubenswrapper[4775]: E0127 11:39:38.244239 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:39 crc kubenswrapper[4775]: I0127 11:39:39.756052 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" path="/var/lib/kubelet/pods/17e205ad-6676-4f5d-b9d0-0d8c958d815d/volumes" Jan 27 11:39:43 crc kubenswrapper[4775]: I0127 11:39:43.230947 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.239908 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.241668 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.244353 4775 log.go:32] "ExecSync cmd 
from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:39:43 crc kubenswrapper[4775]: E0127 11:39:43.244470 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:44 crc kubenswrapper[4775]: I0127 11:39:44.996568 4775 generic.go:334] "Generic (PLEG): container finished" podID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" exitCode=137 Jan 27 11:39:44 crc kubenswrapper[4775]: I0127 11:39:44.996808 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerDied","Data":"cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6"} Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.112593 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") pod \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211819 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") pod \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.211848 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") pod \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\" (UID: \"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9\") " Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.217540 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz" (OuterVolumeSpecName: "kube-api-access-mhrxz") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "kube-api-access-mhrxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.242055 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data" (OuterVolumeSpecName: "config-data") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.246411 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" (UID: "3a749d0b-2b5c-4025-87c6-bb4367b1ebe9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314060 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhrxz\" (UniqueName: \"kubernetes.io/projected/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-kube-api-access-mhrxz\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314101 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:45 crc kubenswrapper[4775]: I0127 11:39:45.314116 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006539 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3a749d0b-2b5c-4025-87c6-bb4367b1ebe9","Type":"ContainerDied","Data":"82d4d61311885172aa8b3e5cc80375eb709a13d1d92b08eb5c2530bda351308b"} Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006588 4775 scope.go:117] "RemoveContainer" containerID="cbed1541ff49736092de6f3a8691fda76dcf2903666524bccfa208d76055adf6" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.006709 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.033351 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.047913 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065400 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065784 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065803 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065824 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065830 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065841 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065847 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065859 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065877 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065882 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065899 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065905 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: E0127 11:39:46.065920 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.065926 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066093 4775 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066108 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" containerName="nova-cell0-conductor-conductor" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066120 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="31617f30-7431-401d-8c41-230d6a49ff72" containerName="barbican-worker-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066137 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9862a859-ad75-4071-ad9a-ec926175e46d" containerName="barbican-keystone-listener-log" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066159 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-api" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066174 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="17e205ad-6676-4f5d-b9d0-0d8c958d815d" containerName="neutron-httpd" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.066717 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.076659 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.079332 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.079583 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-kp5gz" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.135867 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.135934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.136395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.237821 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " 
pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.238224 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.238419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.249198 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.252135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.254941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"nova-cell0-conductor-0\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.393349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.549606 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.553428 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-f49dbf586-l2cmp" Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.633889 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.634408 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b9b59fc66-t6rbl" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log" containerID="cri-o://c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84" gracePeriod=30 Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.634877 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b9b59fc66-t6rbl" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api" containerID="cri-o://174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7" gracePeriod=30 Jan 27 11:39:46 crc kubenswrapper[4775]: I0127 11:39:46.878129 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:39:46 crc kubenswrapper[4775]: W0127 11:39:46.882683 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7617063e_fa32_45fc_b06e_7ecff629f7db.slice/crio-7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8 WatchSource:0}: Error finding container 7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8: Status 404 returned error can't find the container with id 7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8 Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.016550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerStarted","Data":"7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8"} Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.019025 4775 generic.go:334] "Generic (PLEG): container finished" podID="926c665f-b922-4372-85aa-bbe29399eaac" containerID="c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84" exitCode=143 Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.019095 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84"} Jan 27 11:39:47 crc kubenswrapper[4775]: I0127 11:39:47.754599 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a749d0b-2b5c-4025-87c6-bb4367b1ebe9" path="/var/lib/kubelet/pods/3a749d0b-2b5c-4025-87c6-bb4367b1ebe9/volumes" Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.029409 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerStarted","Data":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"} Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.030603 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:48 crc kubenswrapper[4775]: I0127 11:39:48.055297 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.055275666 podStartE2EDuration="2.055275666s" podCreationTimestamp="2026-01-27 11:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:48.045464495 +0000 UTC m=+1167.187062282" watchObservedRunningTime="2026-01-27 11:39:48.055275666 +0000 UTC m=+1167.196873443" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.041985 4775 generic.go:334] "Generic (PLEG): container finished" podID="f3ab198a-6671-407e-931d-e1e6dc109197" containerID="87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9" exitCode=137 Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.042082 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9"} Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.315334 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494168 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494267 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494291 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494338 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494405 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.494433 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") pod \"f3ab198a-6671-407e-931d-e1e6dc109197\" (UID: \"f3ab198a-6671-407e-931d-e1e6dc109197\") " Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.495833 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.495956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.500032 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn" (OuterVolumeSpecName: "kube-api-access-rtljn") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "kube-api-access-rtljn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.500726 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts" (OuterVolumeSpecName: "scripts") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.533400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.577646 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598344 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598389 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598401 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f3ab198a-6671-407e-931d-e1e6dc109197-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598411 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598424 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.598433 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtljn\" (UniqueName: \"kubernetes.io/projected/f3ab198a-6671-407e-931d-e1e6dc109197-kube-api-access-rtljn\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.616484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data" (OuterVolumeSpecName: "config-data") pod "f3ab198a-6671-407e-931d-e1e6dc109197" (UID: "f3ab198a-6671-407e-931d-e1e6dc109197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:49 crc kubenswrapper[4775]: I0127 11:39:49.701014 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3ab198a-6671-407e-931d-e1e6dc109197-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.055756 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f3ab198a-6671-407e-931d-e1e6dc109197","Type":"ContainerDied","Data":"2a2ee9ecd020ed63d838c367608617b5c5b9bef053fb9d27e529ac66f6e55c5a"} Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.056125 4775 scope.go:117] "RemoveContainer" containerID="87c0c670f987fb5b699e39f1152f819ebcf54f73b798b5259ff6a7b344f01fb9" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.055857 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.061758 4775 generic.go:334] "Generic (PLEG): container finished" podID="926c665f-b922-4372-85aa-bbe29399eaac" containerID="174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7" exitCode=0 Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.061905 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7"} Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.082921 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.088153 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109027 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109068 4775 scope.go:117] "RemoveContainer" containerID="b7f67772ea6767fe5e5ebb612038b7900a441fee4eef11de26a544a863c1564c" Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109538 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109555 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core" Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109574 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109581 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109606 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109614 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" Jan 27 11:39:50 crc kubenswrapper[4775]: E0127 11:39:50.109641 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109649 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109844 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-notification-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109860 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="proxy-httpd" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109871 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="ceilometer-central-agent" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.109894 4775 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" containerName="sg-core" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.111543 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.117850 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.147323 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.151118 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.193902 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.196233 4775 scope.go:117] "RemoveContainer" containerID="5c3d79aab2eaf39741cf0a1a88cf8bdc2458d431fe6b12dc6778f596671b970c" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212394 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212463 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212502 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212521 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212621 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.212762 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.217238 4775 scope.go:117] "RemoveContainer" containerID="19dbb05fee4e0f091562b6f8390365f161f03f64f8035720d6e2c940618fe907" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314082 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314152 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314239 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314263 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") pod \"926c665f-b922-4372-85aa-bbe29399eaac\" (UID: \"926c665f-b922-4372-85aa-bbe29399eaac\") " Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314440 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314494 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc 
kubenswrapper[4775]: I0127 11:39:50.314542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314566 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314593 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314611 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.314646 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.315470 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs" (OuterVolumeSpecName: "logs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.316124 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.317136 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.319634 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx" (OuterVolumeSpecName: "kube-api-access-jspwx") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "kube-api-access-jspwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.321023 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts" (OuterVolumeSpecName: "scripts") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.321871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.323595 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.324234 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.327524 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.333650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"ceilometer-0\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.378561 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data" (OuterVolumeSpecName: "config-data") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.379198 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415440 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415489 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/926c665f-b922-4372-85aa-bbe29399eaac-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415499 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415511 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jspwx\" (UniqueName: \"kubernetes.io/projected/926c665f-b922-4372-85aa-bbe29399eaac-kube-api-access-jspwx\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.415523 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.416635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.422425 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "926c665f-b922-4372-85aa-bbe29399eaac" (UID: "926c665f-b922-4372-85aa-bbe29399eaac"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.506601 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.517856 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.517895 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/926c665f-b922-4372-85aa-bbe29399eaac-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:39:50 crc kubenswrapper[4775]: I0127 11:39:50.926321 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:39:50 crc kubenswrapper[4775]: W0127 11:39:50.929035 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod877bcef1_579c_413c_a0c0_6dad63885091.slice/crio-eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229 WatchSource:0}: Error finding container eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229: Status 404 returned error can't find the container with id eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229 Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073180 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b9b59fc66-t6rbl" Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073233 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b9b59fc66-t6rbl" event={"ID":"926c665f-b922-4372-85aa-bbe29399eaac","Type":"ContainerDied","Data":"0fb58f98d42cc735e9a9f8ee52d9b3e8b27d110f1502a0148df7a0c3e74615b7"} Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.073297 4775 scope.go:117] "RemoveContainer" containerID="174033676be0775ea3975296e01fba15ad5de44d5394f6325f82a1a3f89deda7" Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.074302 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229"} Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.112086 4775 scope.go:117] "RemoveContainer" containerID="c8e562dcd249e68b0060406f3b2394c8239c0b9654b1e64e4b6a4b3e8e23ca84" Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.112234 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.120390 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b9b59fc66-t6rbl"] Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.788728 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926c665f-b922-4372-85aa-bbe29399eaac" path="/var/lib/kubelet/pods/926c665f-b922-4372-85aa-bbe29399eaac/volumes" Jan 27 11:39:51 crc kubenswrapper[4775]: I0127 11:39:51.790140 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ab198a-6671-407e-931d-e1e6dc109197" path="/var/lib/kubelet/pods/f3ab198a-6671-407e-931d-e1e6dc109197/volumes" Jan 27 11:39:52 crc kubenswrapper[4775]: I0127 11:39:52.083643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"} Jan 27 11:39:53 crc kubenswrapper[4775]: I0127 11:39:53.108781 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"} Jan 27 11:39:54 crc kubenswrapper[4775]: I0127 11:39:54.120333 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"} Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.142071 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerStarted","Data":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"} Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.142895 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.170117 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.075568595 podStartE2EDuration="6.170095728s" podCreationTimestamp="2026-01-27 11:39:50 +0000 UTC" firstStartedPulling="2026-01-27 11:39:50.930959871 +0000 UTC m=+1170.072557648" lastFinishedPulling="2026-01-27 11:39:55.025487004 +0000 UTC m=+1174.167084781" observedRunningTime="2026-01-27 11:39:56.168322928 +0000 UTC m=+1175.309920705" watchObservedRunningTime="2026-01-27 11:39:56.170095728 +0000 UTC m=+1175.311693515" Jan 27 11:39:56 crc kubenswrapper[4775]: I0127 11:39:56.421631 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.121871 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"] Jan 27 11:39:57 crc kubenswrapper[4775]: E0127 11:39:57.122679 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122701 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log" Jan 27 11:39:57 crc kubenswrapper[4775]: E0127 11:39:57.122726 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122737 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.122980 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-log" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.123004 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="926c665f-b922-4372-85aa-bbe29399eaac" containerName="placement-api" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.123763 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.129219 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.129875 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.172797 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241373 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241459 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241535 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.241625 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.287185 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.288677 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.294481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.330169 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342887 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.342985 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343010 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343078 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343139 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.343154 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.351080 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.351819 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.358254 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.364413 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.365527 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.370409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.370963 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"nova-cell0-cell-mapping-m2t9b\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.385492 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448311 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448363 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448489 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448540 4775 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448566 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448660 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.448870 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.453532 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.460306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.467742 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.469164 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.483055 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.484253 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.518059 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.528196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"nova-api-0\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552512 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552585 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.552777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.572127 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.577407 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.593195 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"nova-cell1-novncproxy-0\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.602780 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.603986 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.608098 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.608244 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.610223 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.621536 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.626071 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.628219 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654662 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654733 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654775 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654817 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654844 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.654906 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.657322 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.662914 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc 
kubenswrapper[4775]: I0127 11:39:57.673068 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.681072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"nova-metadata-0\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756668 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756690 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756716 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756735 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756768 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756824 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.756851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.763661 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.766406 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.788522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"nova-scheduler-0\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861046 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861192 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861287 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861324 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.861382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.862074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.862171 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.864259 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.864791 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.867316 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.884015 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.888412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"dnsmasq-dns-557bbc7df7-9zwtc\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.911893 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.944559 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:39:57 crc kubenswrapper[4775]: I0127 11:39:57.957833 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.145157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.179721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerStarted","Data":"50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c"} Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.248538 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.250308 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.253993 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.254192 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.303322 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.342205 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.378423 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.378748 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.379055 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.379170 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.453007 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.481402 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.481953 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.482058 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.482153 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.487643 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.489704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.490332 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.499141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"nova-cell1-conductor-db-sync-xh4b2\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.616099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.632509 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.712297 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:39:58 crc kubenswrapper[4775]: W0127 11:39:58.726826 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode869df3d_c15b_4610_bb78_00ad49940d17.slice/crio-52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e WatchSource:0}: Error finding container 52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e: Status 404 returned error can't find the container with id 52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e Jan 27 11:39:58 crc kubenswrapper[4775]: I0127 11:39:58.735779 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.191131 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"234a0c458af079345c8244f18b5988ff4056e6dc102c0dcac07740fc8e5eeb55"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.200318 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerStarted","Data":"ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.203800 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.203864 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.212546 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"589962a3449aa5fdeff7c986358683ffe6f8f1614193eefd4398944d9fb0173e"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.229516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerStarted","Data":"7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.244375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerStarted","Data":"6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516"} Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.253565 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"] Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.253823 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-m2t9b" podStartSLOduration=2.253808133 podStartE2EDuration="2.253808133s" podCreationTimestamp="2026-01-27 11:39:57 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:39:59.244716361 +0000 UTC m=+1178.386314138" watchObservedRunningTime="2026-01-27 11:39:59.253808133 +0000 UTC m=+1178.395405910" Jan 27 11:39:59 crc kubenswrapper[4775]: W0127 11:39:59.254723 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3942760_c6b4_43b5_9680_48d8b8ae3854.slice/crio-3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386 WatchSource:0}: Error finding container 3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386: Status 404 returned error can't find the container with id 3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386 Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517502 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517556 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.517597 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.518249 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:39:59 crc kubenswrapper[4775]: I0127 11:39:59.518300 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" gracePeriod=600 Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.271393 4775 generic.go:334] "Generic (PLEG): container finished" podID="e869df3d-c15b-4610-bb78-00ad49940d17" containerID="0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68" exitCode=0 Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68"} Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272154 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerStarted","Data":"ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760"} Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.272256 4775 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276666 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" exitCode=0 Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7"} Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.276846 4775 scope.go:117] "RemoveContainer" containerID="d3e646652935035e4ff54edd9c0e89ba4aba219ed8931315dc5dc4069b80f310" Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.279274 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerStarted","Data":"b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e"} Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.279313 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerStarted","Data":"3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386"} Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.309385 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" podStartSLOduration=3.30936532 podStartE2EDuration="3.30936532s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:00.292149434 +0000 UTC m=+1179.433747231" watchObservedRunningTime="2026-01-27 11:40:00.30936532 +0000 UTC m=+1179.450963087" Jan 27 11:40:00 crc kubenswrapper[4775]: I0127 11:40:00.319121 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" podStartSLOduration=2.318897084 podStartE2EDuration="2.318897084s" podCreationTimestamp="2026-01-27 11:39:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:00.310290166 +0000 UTC m=+1179.451887943" watchObservedRunningTime="2026-01-27 11:40:00.318897084 +0000 UTC m=+1179.460494851" Jan 27 11:40:01 crc kubenswrapper[4775]: I0127 11:40:01.461512 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:01 crc kubenswrapper[4775]: I0127 11:40:01.472655 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.301818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.302398 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerStarted","Data":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.303790 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerStarted","Data":"5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.303919 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" gracePeriod=30 Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.313865 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.313915 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerStarted","Data":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.314034 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log" containerID="cri-o://6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" gracePeriod=30 Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.314174 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata" containerID="cri-o://e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" gracePeriod=30 Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.319194 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.321441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerStarted","Data":"c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d"} Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.374639 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.360300912 podStartE2EDuration="5.374617215s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.312123629 +0000 UTC m=+1177.453721406" lastFinishedPulling="2026-01-27 11:40:01.326439932 +0000 UTC m=+1180.468037709" observedRunningTime="2026-01-27 11:40:02.333026993 +0000 UTC m=+1181.474624770" watchObservedRunningTime="2026-01-27 11:40:02.374617215 +0000 UTC m=+1181.516214992" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.375182 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.515363535 podStartE2EDuration="5.37517696s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.46349171 +0000 UTC m=+1177.605089487" lastFinishedPulling="2026-01-27 11:40:01.323305135 +0000 UTC m=+1180.464902912" observedRunningTime="2026-01-27 11:40:02.36794337 +0000 UTC m=+1181.509541147" watchObservedRunningTime="2026-01-27 11:40:02.37517696 +0000 UTC m=+1181.516774737" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.401983 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.690785042 podStartE2EDuration="5.401970423s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.621169795 +0000 UTC m=+1177.762767572" lastFinishedPulling="2026-01-27 11:40:01.332355176 +0000 UTC m=+1180.473952953" observedRunningTime="2026-01-27 11:40:02.400614036 +0000 UTC m=+1181.542211823" watchObservedRunningTime="2026-01-27 11:40:02.401970423 +0000 UTC m=+1181.543568200" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.431581 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.896297942 podStartE2EDuration="5.431559492s" podCreationTimestamp="2026-01-27 11:39:57 +0000 UTC" firstStartedPulling="2026-01-27 11:39:58.798591308 +0000 UTC m=+1177.940189085" lastFinishedPulling="2026-01-27 11:40:01.333852858 +0000 UTC m=+1180.475450635" observedRunningTime="2026-01-27 11:40:02.426839751 +0000 UTC m=+1181.568437528" watchObservedRunningTime="2026-01-27 11:40:02.431559492 +0000 UTC m=+1181.573157269" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.882964 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.885087 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.945740 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982626 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982740 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.982820 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") pod \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\" (UID: \"c27adc3b-07b5-457f-96f5-cfffea2e34b8\") " Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.983044 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs" (OuterVolumeSpecName: "logs") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.983267 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c27adc3b-07b5-457f-96f5-cfffea2e34b8-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:02 crc kubenswrapper[4775]: I0127 11:40:02.988987 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f" (OuterVolumeSpecName: "kube-api-access-95v4f") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "kube-api-access-95v4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.012722 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data" (OuterVolumeSpecName: "config-data") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.015621 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c27adc3b-07b5-457f-96f5-cfffea2e34b8" (UID: "c27adc3b-07b5-457f-96f5-cfffea2e34b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084546 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084857 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95v4f\" (UniqueName: \"kubernetes.io/projected/c27adc3b-07b5-457f-96f5-cfffea2e34b8-kube-api-access-95v4f\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.084867 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c27adc3b-07b5-457f-96f5-cfffea2e34b8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330522 4775 generic.go:334] "Generic (PLEG): container finished" podID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" exitCode=0 Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330568 4775 generic.go:334] "Generic (PLEG): container finished" podID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" exitCode=143 Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330580 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330628 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330741 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c27adc3b-07b5-457f-96f5-cfffea2e34b8","Type":"ContainerDied","Data":"589962a3449aa5fdeff7c986358683ffe6f8f1614193eefd4398944d9fb0173e"} Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.330761 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.415170 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.419888 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.439481 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464303 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.464804 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464866 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} err="failed to get container status \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.464896 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.465181 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 
11:40:03.465202 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} err="failed to get container status \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465216 4775 scope.go:117] "RemoveContainer" containerID="e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465425 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05"} err="failed to get container status \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": rpc error: code = NotFound desc = could not find container \"e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05\": container with ID starting with e5fc0a06188821679305fdd066d0eca058d10131105c178a2acaa3f6fca0bd05 not found: ID does not exist" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.465459 4775 scope.go:117] "RemoveContainer" containerID="6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.466823 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d"} err="failed to get container status \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": rpc error: code = NotFound desc = could not find container \"6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d\": container with ID starting with 6071608b7572f938f54f18be07c9cb42786f3c1aa3781de31dc18fe00224846d not found: ID does not exist" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468403 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.468855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468872 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log" Jan 27 11:40:03 crc kubenswrapper[4775]: E0127 11:40:03.468902 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.468908 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.469179 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-log" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.469199 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" containerName="nova-metadata-metadata" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.470710 4775 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.476285 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.476519 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.479994 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.595649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596058 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596179 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.596219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698475 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " 
pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698603 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.698640 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.699966 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.705242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.705556 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.707139 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.722953 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"nova-metadata-0\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " pod="openstack/nova-metadata-0" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.760863 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c27adc3b-07b5-457f-96f5-cfffea2e34b8" path="/var/lib/kubelet/pods/c27adc3b-07b5-457f-96f5-cfffea2e34b8/volumes" Jan 27 11:40:03 crc kubenswrapper[4775]: I0127 11:40:03.791046 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:04 crc kubenswrapper[4775]: I0127 11:40:04.305888 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:04 crc kubenswrapper[4775]: I0127 11:40:04.340360 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"7a575331b653caf550f94723cfa812dc6bcd3d3fd0557ee1db29564218b576be"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.350366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.350908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerStarted","Data":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} Jan 27 11:40:05 crc kubenswrapper[4775]: I0127 11:40:05.374595 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.374581562 podStartE2EDuration="2.374581562s" podCreationTimestamp="2026-01-27 11:40:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:05.37161864 +0000 UTC m=+1184.513216417" watchObservedRunningTime="2026-01-27 11:40:05.374581562 +0000 UTC m=+1184.516179339" Jan 27 11:40:06 crc kubenswrapper[4775]: I0127 11:40:06.359854 4775 generic.go:334] "Generic (PLEG): container finished" podID="8726531a-a74e-48cd-a274-6f67ae507560" containerID="7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7" exitCode=0 Jan 27 11:40:06 crc kubenswrapper[4775]: I0127 11:40:06.359914 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerDied","Data":"7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7"} Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.370531 4775 generic.go:334] "Generic (PLEG): container finished" podID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerID="b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e" exitCode=0 Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.370643 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerDied","Data":"b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e"} Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.609410 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.609491 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.749240 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880277 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.880553 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") pod \"8726531a-a74e-48cd-a274-6f67ae507560\" (UID: \"8726531a-a74e-48cd-a274-6f67ae507560\") " Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.886719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts" (OuterVolumeSpecName: "scripts") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.888165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc" (OuterVolumeSpecName: "kube-api-access-m26fc") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "kube-api-access-m26fc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.913294 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data" (OuterVolumeSpecName: "config-data") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.917920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8726531a-a74e-48cd-a274-6f67ae507560" (UID: "8726531a-a74e-48cd-a274-6f67ae507560"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.945273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.960373 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.980436 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983022 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983057 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m26fc\" (UniqueName: \"kubernetes.io/projected/8726531a-a74e-48cd-a274-6f67ae507560-kube-api-access-m26fc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983068 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:07 crc kubenswrapper[4775]: I0127 11:40:07.983077 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8726531a-a74e-48cd-a274-6f67ae507560-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.080653 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.080979 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns" containerID="cri-o://a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615" gracePeriod=10 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-m2t9b" event={"ID":"8726531a-a74e-48cd-a274-6f67ae507560","Type":"ContainerDied","Data":"50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c"} Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385754 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50506c1daff83db792e938bca2854ddaabea300dce60ee38949fcee7261dbf7c" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.385829 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-m2t9b" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.394026 4775 generic.go:334] "Generic (PLEG): container finished" podID="91668934-529e-4df9-b41f-8cd54e5920ea" containerID="a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615" exitCode=0 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.395030 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615"} Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.448666 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.551251 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.609709 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.623671 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.624050 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" containerID="cri-o://12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.624608 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" containerID="cri-o://a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.653258 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.663607 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.663806 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log" containerID="cri-o://b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.664222 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata" containerID="cri-o://408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" gracePeriod=30 Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696728 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696784 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696860 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.696889 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") pod \"91668934-529e-4df9-b41f-8cd54e5920ea\" (UID: \"91668934-529e-4df9-b41f-8cd54e5920ea\") " Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.730752 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx" (OuterVolumeSpecName: "kube-api-access-q6lcx") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "kube-api-access-q6lcx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.785222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.785233 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.791687 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.791767 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801393 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801432 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6lcx\" (UniqueName: \"kubernetes.io/projected/91668934-529e-4df9-b41f-8cd54e5920ea-kube-api-access-q6lcx\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.801464 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.805005 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.808641 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.829140 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config" (OuterVolumeSpecName: "config") pod "91668934-529e-4df9-b41f-8cd54e5920ea" (UID: "91668934-529e-4df9-b41f-8cd54e5920ea"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902783 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902833 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.902850 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/91668934-529e-4df9-b41f-8cd54e5920ea-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:08 crc kubenswrapper[4775]: I0127 11:40:08.973442 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.078361 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106090 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106152 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.106945 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.107237 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") pod \"a3942760-c6b4-43b5-9680-48d8b8ae3854\" (UID: \"a3942760-c6b4-43b5-9680-48d8b8ae3854\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.110233 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl" (OuterVolumeSpecName: "kube-api-access-ndwhl") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "kube-api-access-ndwhl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.110732 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts" (OuterVolumeSpecName: "scripts") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.133669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data" (OuterVolumeSpecName: "config-data") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.134949 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3942760-c6b4-43b5-9680-48d8b8ae3854" (UID: "a3942760-c6b4-43b5-9680-48d8b8ae3854"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.195695 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209212 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndwhl\" (UniqueName: \"kubernetes.io/projected/a3942760-c6b4-43b5-9680-48d8b8ae3854-kube-api-access-ndwhl\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209246 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209260 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.209271 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3942760-c6b4-43b5-9680-48d8b8ae3854-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310150 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310549 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310659 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310702 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.310755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") pod \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\" (UID: \"6a0f23e6-5732-4337-b5fa-d433e99f5cb1\") " Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.311029 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs" (OuterVolumeSpecName: "logs") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.311591 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.314859 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w" (OuterVolumeSpecName: "kube-api-access-vxq6w") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "kube-api-access-vxq6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.334224 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data" (OuterVolumeSpecName: "config-data") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.336285 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.358070 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "6a0f23e6-5732-4337-b5fa-d433e99f5cb1" (UID: "6a0f23e6-5732-4337-b5fa-d433e99f5cb1"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405632 4775 generic.go:334] "Generic (PLEG): container finished" podID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" exitCode=0 Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405670 4775 generic.go:334] "Generic (PLEG): container finished" podID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" exitCode=143 Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405759 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405829 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405844 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6a0f23e6-5732-4337-b5fa-d433e99f5cb1","Type":"ContainerDied","Data":"7a575331b653caf550f94723cfa812dc6bcd3d3fd0557ee1db29564218b576be"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.405863 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410691 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" event={"ID":"a3942760-c6b4-43b5-9680-48d8b8ae3854","Type":"ContainerDied","Data":"3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410736 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3875cdbec0530e620f02f1a307c1a67dee8feb8dbd26d732a25f2b80b3893386" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.410821 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-xh4b2" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419510 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419538 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxq6w\" (UniqueName: \"kubernetes.io/projected/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-kube-api-access-vxq6w\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419553 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.419563 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a0f23e6-5732-4337-b5fa-d433e99f5cb1-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.426339 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" exitCode=143 Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.426520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.446532 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.449493 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bfc9b94f-2kvdd" event={"ID":"91668934-529e-4df9-b41f-8cd54e5920ea","Type":"ContainerDied","Data":"98a47029353e8ac81c34e8a77e13a6ae144436ae57c8cc4cc8ecca40c93dad8a"} Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.453480 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.508251 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.518771 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.520147 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520202 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} err="failed to get container status \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520236 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.520579 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520611 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} err="failed to get container status \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520629 4775 scope.go:117] "RemoveContainer" containerID="408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.520986 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33"} err="failed to get container status \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": rpc error: code = NotFound desc = could not find container \"408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33\": container with ID starting with 408d9b5be53136942b7154264e51d426aa741eb252885f4b37dce59685537d33 not found: ID does not exist" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521007 4775 scope.go:117] "RemoveContainer" containerID="b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521186 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114"} err="failed to get container status \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": rpc error: code = NotFound desc = could not find container \"b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114\": container with ID starting with b5c848f95efe3f2eff7255a05ae932f47bad8ed27ea031bb865d317eacdab114 not found: ID does not exist" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.521204 4775 scope.go:117] "RemoveContainer" containerID="a0e92df054ede73072c8816014c71d3028937fc797e7a11e419afbd459f2f615" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.537577 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.557629 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558142 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558168 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558208 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="init" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558216 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="init" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558235 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558243 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558257 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558264 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558279 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata" Jan 27 11:40:09 crc kubenswrapper[4775]: 
I0127 11:40:09.558289 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata" Jan 27 11:40:09 crc kubenswrapper[4775]: E0127 11:40:09.558301 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.558308 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560116 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-metadata" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560146 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" containerName="dnsmasq-dns" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560159 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" containerName="nova-cell1-conductor-db-sync" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560173 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8726531a-a74e-48cd-a274-6f67ae507560" containerName="nova-manage" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.560180 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" containerName="nova-metadata-log" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.565702 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.570390 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.571869 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.575930 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.576326 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.576731 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.577618 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.579038 4775 scope.go:117] "RemoveContainer" containerID="88473ae1a8fc90fa959a314a4a49d93772825f6cd05e1adb0fc249904b937add" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.595527 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.606948 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.618668 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bfc9b94f-2kvdd"] Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624499 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624593 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624637 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624770 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.624998 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.625020 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726630 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726707 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726839 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726878 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " 
pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.726894 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.727414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.730701 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.731293 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.732099 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.739534 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.740022 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.742319 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"nova-cell1-conductor-0\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.743233 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"nova-metadata-0\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") " pod="openstack/nova-metadata-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.760583 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a0f23e6-5732-4337-b5fa-d433e99f5cb1" 
path="/var/lib/kubelet/pods/6a0f23e6-5732-4337-b5fa-d433e99f5cb1/volumes" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.761336 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91668934-529e-4df9-b41f-8cd54e5920ea" path="/var/lib/kubelet/pods/91668934-529e-4df9-b41f-8cd54e5920ea/volumes" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.900739 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:09 crc kubenswrapper[4775]: I0127 11:40:09.910829 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.420773 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:40:10 crc kubenswrapper[4775]: W0127 11:40:10.423878 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7916937d_e997_4d88_8a6b_9fecf57f6828.slice/crio-659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab WatchSource:0}: Error finding container 659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab: Status 404 returned error can't find the container with id 659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.455412 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"} Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.458533 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" containerID="cri-o://c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" gracePeriod=30 Jan 27 11:40:10 crc kubenswrapper[4775]: I0127 11:40:10.486031 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:40:10 crc kubenswrapper[4775]: W0127 11:40:10.493978 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2945fbf_3178_420a_bfaf_d0d9c91d610a.slice/crio-f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f WatchSource:0}: Error finding container f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f: Status 404 returned error can't find the container with id f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.467750 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f"} Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.467791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerStarted","Data":"8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d"} Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.474376 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerStarted","Data":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"} Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.474467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerStarted","Data":"f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f"} Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.476234 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.500069 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.500052801 podStartE2EDuration="2.500052801s" podCreationTimestamp="2026-01-27 11:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:11.497360806 +0000 UTC m=+1190.638958583" watchObservedRunningTime="2026-01-27 11:40:11.500052801 +0000 UTC m=+1190.641650578" Jan 27 11:40:11 crc kubenswrapper[4775]: I0127 11:40:11.527633 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.5276122130000003 podStartE2EDuration="2.527612213s" podCreationTimestamp="2026-01-27 11:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:11.523836409 +0000 UTC m=+1190.665434206" watchObservedRunningTime="2026-01-27 11:40:11.527612213 +0000 UTC m=+1190.669209980" Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.947298 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.948715 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.955935 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 27 11:40:12 crc kubenswrapper[4775]: E0127 11:40:12.955976 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490357 4775 generic.go:334] "Generic (PLEG): container finished" podID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" 
containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" exitCode=0 Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490477 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerDied","Data":"c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d"} Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0c2fced-9c0a-4cef-90ed-d6429ee82751","Type":"ContainerDied","Data":"6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516"} Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.490758 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cc0b4e0ee3ee3d1c37a3c99f72cc692ec43b7950501d86f96aae8aad30b9516" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.525072 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610289 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610436 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.610530 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") pod \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\" (UID: \"d0c2fced-9c0a-4cef-90ed-d6429ee82751\") " Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.616604 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj" (OuterVolumeSpecName: "kube-api-access-dc8nj") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "kube-api-access-dc8nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.635760 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data" (OuterVolumeSpecName: "config-data") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.637637 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0c2fced-9c0a-4cef-90ed-d6429ee82751" (UID: "d0c2fced-9c0a-4cef-90ed-d6429ee82751"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712371 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc8nj\" (UniqueName: \"kubernetes.io/projected/d0c2fced-9c0a-4cef-90ed-d6429ee82751-kube-api-access-dc8nj\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712404 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:13 crc kubenswrapper[4775]: I0127 11:40:13.712415 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0c2fced-9c0a-4cef-90ed-d6429ee82751-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.464788 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.500602 4775 generic.go:334] "Generic (PLEG): container finished" podID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" exitCode=0 Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.500694 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501646 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501922 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"} Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501963 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8f6cebd-0ba7-4713-906a-f48b094c332b","Type":"ContainerDied","Data":"234a0c458af079345c8244f18b5988ff4056e6dc102c0dcac07740fc8e5eeb55"} Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.501984 4775 scope.go:117] "RemoveContainer" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.528951 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.529107 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530736 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs" (OuterVolumeSpecName: "logs") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530887 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.530975 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.531089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") pod \"d8f6cebd-0ba7-4713-906a-f48b094c332b\" (UID: \"d8f6cebd-0ba7-4713-906a-f48b094c332b\") " Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.533315 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8f6cebd-0ba7-4713-906a-f48b094c332b-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.546730 4775 scope.go:117] "RemoveContainer" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.573855 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb" (OuterVolumeSpecName: "kube-api-access-8zjnb") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "kube-api-access-8zjnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.573996 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data" (OuterVolumeSpecName: "config-data") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.574037 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.576141 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8f6cebd-0ba7-4713-906a-f48b094c332b" (UID: "d8f6cebd-0ba7-4713-906a-f48b094c332b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.582869 4775 scope.go:117] "RemoveContainer" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.583643 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": container with ID starting with a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d not found: ID does not exist" containerID="a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583685 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d"} err="failed to get container status \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": rpc error: code = NotFound desc = could not find container \"a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d\": container with ID starting with a87f90296bfd05cb27cef3ac314414bee645b80295ae4056d35b742f989c4a4d not found: ID does not exist" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583705 4775 scope.go:117] "RemoveContainer" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.583949 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": container with ID starting with 12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8 not found: ID does not exist" containerID="12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.583965 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8"} err="failed to get container status \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": rpc error: code = NotFound desc = could not find container \"12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8\": container with ID starting with 12f1900c309caf7292e2f714a53f2174d15ca569f288959bfdf4af37418847b8 not found: ID does not exist" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.601814 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602194 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602213 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602226 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602232 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: E0127 11:40:14.602250 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602256 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602410 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" containerName="nova-scheduler-scheduler" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602424 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-api" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.602438 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" containerName="nova-api-log" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.603064 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.605375 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.613947 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635121 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635164 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8f6cebd-0ba7-4713-906a-f48b094c332b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.635178 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjnb\" (UniqueName: \"kubernetes.io/projected/d8f6cebd-0ba7-4713-906a-f48b094c332b-kube-api-access-8zjnb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737614 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.737779 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.838615 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.839958 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.840030 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.840083 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.844190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.847181 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.854410 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.862161 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"nova-scheduler-0\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") " pod="openstack/nova-scheduler-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.866862 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.868525 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.870730 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.883849 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.912307 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.912609 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942113 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942557 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942583 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.942663 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:14 crc kubenswrapper[4775]: I0127 11:40:14.970580 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050784 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050830 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.050954 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.051976 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.056403 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.064303 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.075954 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"nova-api-0\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.190693 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.383179 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:40:15 crc kubenswrapper[4775]: W0127 11:40:15.399016 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb76fecdf_e253_454b_8e4e_4c9109834188.slice/crio-c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb WatchSource:0}: Error finding container c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb: Status 404 returned error can't find the container with id c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.513948 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerStarted","Data":"c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb"} Jan 27 11:40:15 crc kubenswrapper[4775]: W0127 11:40:15.606649 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3bcd5d9_85e4_4754_b39f_17ee05c9991e.slice/crio-e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0 WatchSource:0}: Error finding container e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0: Status 404 returned error can't find the container with id e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0 Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.606974 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.756396 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0c2fced-9c0a-4cef-90ed-d6429ee82751" path="/var/lib/kubelet/pods/d0c2fced-9c0a-4cef-90ed-d6429ee82751/volumes" Jan 27 11:40:15 crc kubenswrapper[4775]: I0127 11:40:15.757105 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f6cebd-0ba7-4713-906a-f48b094c332b" path="/var/lib/kubelet/pods/d8f6cebd-0ba7-4713-906a-f48b094c332b/volumes" Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.523913 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerStarted","Data":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"} Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526255 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"} Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526298 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"} Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.526312 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerStarted","Data":"e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0"} Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.551963 4775 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.551945742 podStartE2EDuration="2.551945742s" podCreationTimestamp="2026-01-27 11:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:16.540531547 +0000 UTC m=+1195.682129324" watchObservedRunningTime="2026-01-27 11:40:16.551945742 +0000 UTC m=+1195.693543519" Jan 27 11:40:16 crc kubenswrapper[4775]: I0127 11:40:16.565564 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.565544998 podStartE2EDuration="2.565544998s" podCreationTimestamp="2026-01-27 11:40:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:16.559188073 +0000 UTC m=+1195.700785850" watchObservedRunningTime="2026-01-27 11:40:16.565544998 +0000 UTC m=+1195.707142775" Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.911995 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.912377 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.935956 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 11:40:19 crc kubenswrapper[4775]: I0127 11:40:19.970719 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.511335 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.923598 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:20 crc kubenswrapper[4775]: I0127 11:40:20.923626 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:24 crc kubenswrapper[4775]: I0127 11:40:24.971369 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.017701 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.111163 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.111697 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics" containerID="cri-o://41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" gracePeriod=30 Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.193088 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.193159 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.593952 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.634992 4775 generic.go:334] "Generic (PLEG): container finished" podID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" exitCode=2 Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.635895 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerDied","Data":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"} Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636337 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d650e06f-8d9a-443d-9045-82cef3c36ad3","Type":"ContainerDied","Data":"8c517699b915acc52e0019dc1c45d2e9a3ea6904e06f7498f332512ca9be5304"} Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.636367 4775 scope.go:117] "RemoveContainer" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.657891 4775 scope.go:117] "RemoveContainer" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" Jan 27 11:40:25 crc kubenswrapper[4775]: E0127 11:40:25.658271 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": container with ID starting with 41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888 not found: ID does not exist" containerID="41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.658311 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888"} err="failed to get container status \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": rpc error: code = NotFound desc = could not find container \"41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888\": container with ID starting with 41f01f5c05056d84d7dfaf2c3479c52ee9bd356d470405874f151bfd9ab81888 not found: ID does not exist" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.659795 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") pod \"d650e06f-8d9a-443d-9045-82cef3c36ad3\" (UID: \"d650e06f-8d9a-443d-9045-82cef3c36ad3\") " Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.668141 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.668167 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx" (OuterVolumeSpecName: "kube-api-access-zzdzx") pod "d650e06f-8d9a-443d-9045-82cef3c36ad3" (UID: "d650e06f-8d9a-443d-9045-82cef3c36ad3"). InnerVolumeSpecName "kube-api-access-zzdzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.762223 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzdzx\" (UniqueName: \"kubernetes.io/projected/d650e06f-8d9a-443d-9045-82cef3c36ad3-kube-api-access-zzdzx\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.964147 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.974717 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993354 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:25 crc kubenswrapper[4775]: E0127 11:40:25.993755 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993772 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.993937 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" containerName="kube-state-metrics" Jan 27 11:40:25 crc kubenswrapper[4775]: I0127 11:40:25.994521 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.000535 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.001123 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.016831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069585 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069657 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069750 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.069777 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171831 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.171856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.175512 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.177010 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.185412 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.192338 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8vm4\" (UniqueName: \"kubernetes.io/projected/7aa68248-0707-4f5c-8689-57cf6d07c250-kube-api-access-h8vm4\") pod \"kube-state-metrics-0\" (UID: \"7aa68248-0707-4f5c-8689-57cf6d07c250\") " pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.276615 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.276622 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.212:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.312887 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 27 11:40:26 crc kubenswrapper[4775]: I0127 11:40:26.823371 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.187188 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193353 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent" containerID="cri-o://3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" gracePeriod=30 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193833 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd" containerID="cri-o://eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" gracePeriod=30 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193907 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core" containerID="cri-o://8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" gracePeriod=30 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.193946 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent" containerID="cri-o://43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" gracePeriod=30 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658077 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" exitCode=0 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658405 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" exitCode=2 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658416 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" exitCode=0 Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"} Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658500 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"} Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.658514 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"} Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660375 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"7aa68248-0707-4f5c-8689-57cf6d07c250","Type":"ContainerStarted","Data":"00a80a967ddb6eab2e4de0d664cc76e1e667eaa84ad42ded92f09ec9ae23383c"} Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660406 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7aa68248-0707-4f5c-8689-57cf6d07c250","Type":"ContainerStarted","Data":"4e5b8f5a86f4b65ac4cff674b3b701010564e4e4d052eb787ecc2de495fba9f1"} Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.660497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.674896 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.325264385 podStartE2EDuration="2.674879045s" podCreationTimestamp="2026-01-27 11:40:25 +0000 UTC" firstStartedPulling="2026-01-27 11:40:26.827516833 +0000 UTC m=+1205.969114600" lastFinishedPulling="2026-01-27 11:40:27.177131483 +0000 UTC m=+1206.318729260" observedRunningTime="2026-01-27 11:40:27.672795637 +0000 UTC m=+1206.814393434" watchObservedRunningTime="2026-01-27 11:40:27.674879045 +0000 UTC m=+1206.816476822" Jan 27 11:40:27 crc kubenswrapper[4775]: I0127 11:40:27.755576 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d650e06f-8d9a-443d-9045-82cef3c36ad3" path="/var/lib/kubelet/pods/d650e06f-8d9a-443d-9045-82cef3c36ad3/volumes" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.304353 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434671 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434783 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434830 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434937 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.434979 4775 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435031 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") pod \"877bcef1-579c-413c-a0c0-6dad63885091\" (UID: \"877bcef1-579c-413c-a0c0-6dad63885091\") " Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435467 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.435500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.440049 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv" (OuterVolumeSpecName: "kube-api-access-44vhv") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "kube-api-access-44vhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.446569 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts" (OuterVolumeSpecName: "scripts") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.471253 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.505591 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537434 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537480 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537507 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537516 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537524 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44vhv\" (UniqueName: \"kubernetes.io/projected/877bcef1-579c-413c-a0c0-6dad63885091-kube-api-access-44vhv\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.537532 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/877bcef1-579c-413c-a0c0-6dad63885091-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.543663 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data" (OuterVolumeSpecName: "config-data") pod "877bcef1-579c-413c-a0c0-6dad63885091" (UID: "877bcef1-579c-413c-a0c0-6dad63885091"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.638927 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/877bcef1-579c-413c-a0c0-6dad63885091-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684443 4775 generic.go:334] "Generic (PLEG): container finished" podID="877bcef1-579c-413c-a0c0-6dad63885091" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" exitCode=0 Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"} Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684530 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"877bcef1-579c-413c-a0c0-6dad63885091","Type":"ContainerDied","Data":"eeb5d6eb3865672e5d710d66ff273bcee9e0b5353cef376cf3d7740ea7501229"} Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684548 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.684565 4775 scope.go:117] "RemoveContainer" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.711799 4775 scope.go:117] "RemoveContainer" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.721505 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.730214 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.741034 4775 scope.go:117] "RemoveContainer" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.756674 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="877bcef1-579c-413c-a0c0-6dad63885091" path="/var/lib/kubelet/pods/877bcef1-579c-413c-a0c0-6dad63885091/volumes" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757441 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757800 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757818 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757837 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757845 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757857 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757864 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.757892 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.757901 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758082 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-central-agent" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758109 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="sg-core" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.758121 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="ceilometer-notification-agent" Jan 27 11:40:29 crc 
kubenswrapper[4775]: I0127 11:40:29.758130 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="877bcef1-579c-413c-a0c0-6dad63885091" containerName="proxy-httpd" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.766978 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.768682 4775 scope.go:117] "RemoveContainer" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.770815 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.770965 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.780975 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.796507 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.806915 4775 scope.go:117] "RemoveContainer" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.807421 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": container with ID starting with eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261 not found: ID does not exist" containerID="eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.807479 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261"} err="failed to get container status \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": rpc error: code = NotFound desc = could not find container \"eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261\": container with ID starting with eff659babc32075216c9598b9519e59a4fdf29ab5e7ce5862c65ff474c7c3261 not found: ID does not exist" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.807508 4775 scope.go:117] "RemoveContainer" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.810887 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": container with ID starting with 8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d not found: ID does not exist" containerID="8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.810943 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d"} err="failed to get container status \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": rpc error: code = NotFound desc = could not find container \"8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d\": container with ID starting 
with 8f09bfef702f3a12ff1001ec860e9fd01e77b9d5136061da4ea689325e205c1d not found: ID does not exist" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.810972 4775 scope.go:117] "RemoveContainer" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.811265 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": container with ID starting with 43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005 not found: ID does not exist" containerID="43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811295 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005"} err="failed to get container status \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": rpc error: code = NotFound desc = could not find container \"43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005\": container with ID starting with 43f55b83de96b10d81cf4a4e657396d176e1d4cf8de47eb33e07b722b317a005 not found: ID does not exist" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811315 4775 scope.go:117] "RemoveContainer" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" Jan 27 11:40:29 crc kubenswrapper[4775]: E0127 11:40:29.811650 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": container with ID starting with 3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2 not found: ID does not exist" containerID="3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.811672 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2"} err="failed to get container status \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": rpc error: code = NotFound desc = could not find container \"3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2\": container with ID starting with 3ba28cfdd4bb584dbb1a0efcde783ac7495bcc5cb4b45494387da9d66d2992c2 not found: ID does not exist" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842047 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842125 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842147 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842202 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842239 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842261 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842301 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.842345 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.917104 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.919089 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.924392 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943571 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943636 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943701 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943763 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943824 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943859 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943889 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.943949 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.944071 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.944402 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.947902 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.948330 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.948569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.949392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.950048 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:29 crc kubenswrapper[4775]: I0127 11:40:29.967390 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"ceilometer-0\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " pod="openstack/ceilometer-0" Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.093653 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:30 crc kubenswrapper[4775]: W0127 11:40:30.597012 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e2cdf2_0f89_471a_b0b4_a6437dfc7428.slice/crio-2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b WatchSource:0}: Error finding container 2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b: Status 404 returned error can't find the container with id 2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.604988 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.693721 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b"} Jan 27 11:40:30 crc kubenswrapper[4775]: I0127 11:40:30.701550 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:31.703816 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712596 4775 generic.go:334] "Generic (PLEG): container finished" podID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerID="5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" exitCode=137 Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712681 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerDied","Data":"5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712857 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8","Type":"ContainerDied","Data":"ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177"} Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.712870 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed92b0535d28c0e558eeedf3ab4bfde4b43bbb5e6bbdcef58e08c4e58984f177" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.760067 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908225 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908352 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.908474 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") pod \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\" (UID: \"d9c0a867-6f9b-4f43-a18e-0c05e79f16a8\") " Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.914337 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc" (OuterVolumeSpecName: "kube-api-access-nmdhc") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "kube-api-access-nmdhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.935165 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:32 crc kubenswrapper[4775]: I0127 11:40:32.936321 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data" (OuterVolumeSpecName: "config-data") pod "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" (UID: "d9c0a867-6f9b-4f43-a18e-0c05e79f16a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011899 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011959 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmdhc\" (UniqueName: \"kubernetes.io/projected/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-kube-api-access-nmdhc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.011982 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.725560 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.725609 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.770398 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.785488 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.796414 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: E0127 11:40:33.797861 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.797897 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.798136 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.798881 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.807498 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.809666 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.810080 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.810505 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928329 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:33 crc kubenswrapper[4775]: I0127 11:40:33.928420 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029534 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029693 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029728 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.029768 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.034414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.036015 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.039888 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.044918 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.045392 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"nova-cell1-novncproxy-0\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.134036 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.588956 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:40:34 crc kubenswrapper[4775]: W0127 11:40:34.594758 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6628a06a_9e13_4402_94d9_1df5c42e3c7a.slice/crio-fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2 WatchSource:0}: Error finding container fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2: Status 404 returned error can't find the container with id fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2 Jan 27 11:40:34 crc kubenswrapper[4775]: I0127 11:40:34.735402 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerStarted","Data":"fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2"} Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.194693 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.195208 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.199556 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.202069 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.754375 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c0a867-6f9b-4f43-a18e-0c05e79f16a8" path="/var/lib/kubelet/pods/d9c0a867-6f9b-4f43-a18e-0c05e79f16a8/volumes" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerStarted","Data":"6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592"} Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755094 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.755148 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.776336 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.776312737 podStartE2EDuration="2.776312737s" podCreationTimestamp="2026-01-27 11:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:35.766212148 +0000 UTC m=+1214.907809915" watchObservedRunningTime="2026-01-27 11:40:35.776312737 +0000 UTC m=+1214.917910514" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.943191 4775 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.945310 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:35 crc kubenswrapper[4775]: I0127 11:40:35.979083 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068351 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068403 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068856 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068925 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068975 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.068995 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170082 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170138 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170235 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170286 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.170310 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171179 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171248 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.171844 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.172027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" 
Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.191063 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"dnsmasq-dns-5ddd577785-dvccn\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.311405 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.360797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 27 11:40:36 crc kubenswrapper[4775]: I0127 11:40:36.821083 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.770843 4775 generic.go:334] "Generic (PLEG): container finished" podID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerID="86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a" exitCode=0 Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.773614 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a"} Jan 27 11:40:37 crc kubenswrapper[4775]: I0127 11:40:37.773660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerStarted","Data":"a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692"} Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.312029 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.417944 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.780646 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerStarted","Data":"cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c"} Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.781803 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.791537 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" containerID="cri-o://e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" gracePeriod=30 Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792278 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerStarted","Data":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792310 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.792358 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" containerID="cri-o://3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" gracePeriod=30 Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.814845 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" podStartSLOduration=3.814831951 podStartE2EDuration="3.814831951s" podCreationTimestamp="2026-01-27 11:40:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:38.802637923 +0000 UTC m=+1217.944235700" watchObservedRunningTime="2026-01-27 11:40:38.814831951 +0000 UTC m=+1217.956429728" Jan 27 11:40:38 crc kubenswrapper[4775]: I0127 11:40:38.835008 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.811908766 podStartE2EDuration="9.834987778s" podCreationTimestamp="2026-01-27 11:40:29 +0000 UTC" firstStartedPulling="2026-01-27 11:40:30.599363722 +0000 UTC m=+1209.740961499" lastFinishedPulling="2026-01-27 11:40:37.622442734 +0000 UTC m=+1216.764040511" observedRunningTime="2026-01-27 11:40:38.832080078 +0000 UTC m=+1217.973677855" watchObservedRunningTime="2026-01-27 11:40:38.834987778 +0000 UTC m=+1217.976585555" Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.134152 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.801364 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" exitCode=143 Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.801460 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"} Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802016 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" containerID="cri-o://092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" gracePeriod=30 Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802039 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" containerID="cri-o://d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" gracePeriod=30 Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802107 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" containerID="cri-o://4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" gracePeriod=30 Jan 27 11:40:39 crc kubenswrapper[4775]: I0127 11:40:39.802138 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" containerID="cri-o://3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" gracePeriod=30 Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.619913 4775 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765305 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765697 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765761 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.765913 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766001 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766058 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766085 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766233 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766270 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") pod \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\" (UID: \"18e2cdf2-0f89-471a-b0b4-a6437dfc7428\") " Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.766669 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.767072 4775 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.767093 4775 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.771910 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs" (OuterVolumeSpecName: "kube-api-access-dnhfs") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "kube-api-access-dnhfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.772218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts" (OuterVolumeSpecName: "scripts") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.800293 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823117 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" exitCode=0 Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823158 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" exitCode=2 Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823169 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" exitCode=0 Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.823177 4775 generic.go:334] "Generic (PLEG): container finished" podID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" exitCode=0 Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.824488 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825035 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825072 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825111 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"18e2cdf2-0f89-471a-b0b4-a6437dfc7428","Type":"ContainerDied","Data":"2720de5b9b26ed69e8ff5d7378ac05437110fdc5c5ce3bf22870368966b52b2b"} Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.825128 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.828661 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.843402 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.850627 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875520 4775 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875552 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875562 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnhfs\" (UniqueName: \"kubernetes.io/projected/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-kube-api-access-dnhfs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875573 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-scripts\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.875582 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.877750 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.885107 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data" (OuterVolumeSpecName: "config-data") pod "18e2cdf2-0f89-471a-b0b4-a6437dfc7428" (UID: "18e2cdf2-0f89-471a-b0b4-a6437dfc7428"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.900814 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.918724 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919131 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919183 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919203 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919429 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919482 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919496 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919680 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919699 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919712 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: E0127 11:40:40.919840 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919854 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919866 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.919998 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920012 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920189 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920204 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920358 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920377 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920823 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.920856 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.921106 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.921127 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922160 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922193 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922432 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" Jan 
27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922492 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922698 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.922719 4775 scope.go:117] "RemoveContainer" containerID="d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923006 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1"} err="failed to get container status \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": rpc error: code = NotFound desc = could not find container \"d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1\": container with ID starting with d0ef0d57ef2cb01a9ed78c9705351daf719a763aafe6bbfc18ad594dccb6ecc1 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923031 4775 scope.go:117] "RemoveContainer" containerID="4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923236 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4"} err="failed to get container status \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": rpc error: code = NotFound desc = could not find container \"4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4\": container with ID starting with 4e2cb0362bb59c3a86c6a11b7b2d821da1763dec9004ccc0cbca324929b15ba4 not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923255 4775 scope.go:117] "RemoveContainer" containerID="3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923413 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b"} err="failed to get container status \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": rpc error: code = NotFound desc = could not find container \"3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b\": container with ID starting with 3aa351457dbf3628ba0668a1c569333b1fe69944d8cc148fa0a02c6837d1c63b not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.923433 4775 scope.go:117] "RemoveContainer" containerID="092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.924338 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d"} err="failed to get container status 
\"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": rpc error: code = NotFound desc = could not find container \"092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d\": container with ID starting with 092e67118e8e16fc333aa43dad0c8b377fd9b63b8ed1e3f610efe4f8cf4ea83d not found: ID does not exist" Jan 27 11:40:40 crc kubenswrapper[4775]: I0127 11:40:40.977992 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/18e2cdf2-0f89-471a-b0b4-a6437dfc7428-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.160533 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.171775 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.187529 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188248 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188373 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188479 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188552 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188653 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188754 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: E0127 11:40:41.188831 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.188902 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189176 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="proxy-httpd" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189267 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-central-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189354 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="sg-core" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.189428 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" containerName="ceilometer-notification-agent" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.191694 4775 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194333 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.194836 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.196909 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288620 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288661 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288682 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288734 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.288753 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390426 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390790 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390939 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.390976 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.391034 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.391628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.392302 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-run-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.392323 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f0fb6dfd-0694-418a-965e-789707762ef7-log-httpd\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.400414 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-scripts\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.400825 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.401345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-config-data\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.406875 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.407596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0fb6dfd-0694-418a-965e-789707762ef7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.414837 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r2z2\" (UniqueName: \"kubernetes.io/projected/f0fb6dfd-0694-418a-965e-789707762ef7-kube-api-access-5r2z2\") pod \"ceilometer-0\" (UID: \"f0fb6dfd-0694-418a-965e-789707762ef7\") " pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.512657 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.763305 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e2cdf2-0f89-471a-b0b4-a6437dfc7428" path="/var/lib/kubelet/pods/18e2cdf2-0f89-471a-b0b4-a6437dfc7428/volumes" Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.960661 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 27 11:40:41 crc kubenswrapper[4775]: W0127 11:40:41.966876 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0fb6dfd_0694_418a_965e_789707762ef7.slice/crio-00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e WatchSource:0}: Error finding container 00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e: Status 404 returned error can't find the container with id 00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e Jan 27 11:40:41 crc kubenswrapper[4775]: I0127 11:40:41.969437 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.425250 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515687 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515757 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.515965 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.516111 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") pod \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\" (UID: \"d3bcd5d9-85e4-4754-b39f-17ee05c9991e\") " Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.517259 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs" (OuterVolumeSpecName: "logs") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.523581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q" (OuterVolumeSpecName: "kube-api-access-zzc4q") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "kube-api-access-zzc4q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.562817 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.568073 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data" (OuterVolumeSpecName: "config-data") pod "d3bcd5d9-85e4-4754-b39f-17ee05c9991e" (UID: "d3bcd5d9-85e4-4754-b39f-17ee05c9991e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618367 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618435 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618478 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzc4q\" (UniqueName: \"kubernetes.io/projected/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-kube-api-access-zzc4q\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.618491 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3bcd5d9-85e4-4754-b39f-17ee05c9991e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.843415 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.843471 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"00fa37c4b19ad035bf15e2caeaf7dca2a35d6e525feaeaad8a2378fac6e0e19e"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845153 4775 generic.go:334] "Generic (PLEG): container finished" podID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" exitCode=0 Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845203 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d3bcd5d9-85e4-4754-b39f-17ee05c9991e","Type":"ContainerDied","Data":"e5afc5e5b5021351beb662aa89d06ff2baa448336af2777de4b79215ed4b77f0"} Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845213 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.845223 4775 scope.go:117] "RemoveContainer" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.869706 4775 scope.go:117] "RemoveContainer" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.883082 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.899256 4775 scope.go:117] "RemoveContainer" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.900409 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": container with ID starting with 3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a not found: ID does not exist" containerID="3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900443 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a"} err="failed to get container status \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": rpc error: code = NotFound desc = could not find container \"3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a\": container with ID starting with 3572b1dddc3339c93ce7b2a670ab6c22d6510d4f5f7358b8186b1e76f2d2d18a not found: ID does not exist" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900496 4775 scope.go:117] "RemoveContainer" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.900778 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": container with ID starting with e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07 not found: ID does not exist" containerID="e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.900816 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07"} err="failed to get container status \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": rpc error: code = NotFound desc = could not find container \"e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07\": container with ID starting with e7f0fca3202ad4597f48c17721fad00a70200aa8829bd61576aa753285ed1c07 not found: ID does not exist" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.907077 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.920849 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.921243 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc 
kubenswrapper[4775]: I0127 11:40:42.921260 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc kubenswrapper[4775]: E0127 11:40:42.921273 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921280 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921479 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-log" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.921496 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" containerName="nova-api-api" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.922373 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928067 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928282 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.928429 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 11:40:42 crc kubenswrapper[4775]: I0127 11:40:42.933061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025395 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025482 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025556 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: 
\"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.025572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127426 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127543 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127567 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.127581 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.128748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.135926 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.136021 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") 
" pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.139237 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.140079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.145190 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"nova-api-0\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") " pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.258653 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.789769 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bcd5d9-85e4-4754-b39f-17ee05c9991e" path="/var/lib/kubelet/pods/d3bcd5d9-85e4-4754-b39f-17ee05c9991e/volumes" Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.795392 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.854465 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"e9b280e81c9b3e2f27e2a003b985657f4d413b9e18eca2cc53bed8cbd3cdcb27"} Jan 27 11:40:43 crc kubenswrapper[4775]: I0127 11:40:43.856248 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.134734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.150482 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.868618 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"3ce959ac65992ddc8dbe0e5dc438cf5975b410ccc59cc1574279bcf41dba9159"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.870966 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.871019 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerStarted","Data":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.901381 4775 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.901362281 podStartE2EDuration="2.901362281s" podCreationTimestamp="2026-01-27 11:40:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:44.892689531 +0000 UTC m=+1224.034287318" watchObservedRunningTime="2026-01-27 11:40:44.901362281 +0000 UTC m=+1224.042960058" Jan 27 11:40:44 crc kubenswrapper[4775]: I0127 11:40:44.934623 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.133679 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.134712 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.137052 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.137075 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.147605 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263344 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263698 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.263782 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366179 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" 
Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366261 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366362 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.366389 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.372097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.376155 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.385715 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.387131 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"nova-cell1-cell-mapping-4lnkz\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.449988 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.881388 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"6a28a6bfae3dfae0c75190ed63d0170da7178a0b262c8aee08135af71c93d6d7"} Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.906800 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.5055357520000001 podStartE2EDuration="4.906768279s" podCreationTimestamp="2026-01-27 11:40:41 +0000 UTC" firstStartedPulling="2026-01-27 11:40:41.969192342 +0000 UTC m=+1221.110790119" lastFinishedPulling="2026-01-27 11:40:45.370424869 +0000 UTC m=+1224.512022646" observedRunningTime="2026-01-27 11:40:45.906522283 +0000 UTC m=+1225.048120070" watchObservedRunningTime="2026-01-27 11:40:45.906768279 +0000 UTC m=+1225.048366056" Jan 27 11:40:45 crc kubenswrapper[4775]: W0127 11:40:45.955581 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb77cbe7c_5901_44d2_959f_5435b8adbc85.slice/crio-a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3 WatchSource:0}: Error finding container a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3: Status 404 returned error can't find the container with id a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3 Jan 27 11:40:45 crc kubenswrapper[4775]: I0127 11:40:45.956622 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.313760 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.391770 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.392110 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns" containerID="cri-o://ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" gracePeriod=10 Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905114 4775 generic.go:334] "Generic (PLEG): container finished" podID="e869df3d-c15b-4610-bb78-00ad49940d17" containerID="ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" exitCode=0 Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905517 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" event={"ID":"e869df3d-c15b-4610-bb78-00ad49940d17","Type":"ContainerDied","Data":"52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.905528 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52b60424b8cf5ff2ff4e842e6b10a8198ed82ddf2c9b7ee1ba530ecf8959634e" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 
11:40:46.908468 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerStarted","Data":"fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.908520 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerStarted","Data":"a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3"} Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.908766 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.925761 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-4lnkz" podStartSLOduration=1.9257450839999999 podStartE2EDuration="1.925745084s" podCreationTimestamp="2026-01-27 11:40:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:46.924956562 +0000 UTC m=+1226.066554339" watchObservedRunningTime="2026-01-27 11:40:46.925745084 +0000 UTC m=+1226.067342861" Jan 27 11:40:46 crc kubenswrapper[4775]: I0127 11:40:46.928329 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024770 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024850 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024884 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.024945 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.025021 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") pod \"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.025116 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") pod 
\"e869df3d-c15b-4610-bb78-00ad49940d17\" (UID: \"e869df3d-c15b-4610-bb78-00ad49940d17\") " Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.036962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444" (OuterVolumeSpecName: "kube-api-access-qw444") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "kube-api-access-qw444". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.079105 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.083619 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.084703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.102065 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config" (OuterVolumeSpecName: "config") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127035 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127073 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127083 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127094 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qw444\" (UniqueName: \"kubernetes.io/projected/e869df3d-c15b-4610-bb78-00ad49940d17-kube-api-access-qw444\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.127104 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.130352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e869df3d-c15b-4610-bb78-00ad49940d17" (UID: "e869df3d-c15b-4610-bb78-00ad49940d17"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.228809 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e869df3d-c15b-4610-bb78-00ad49940d17-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.915413 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-557bbc7df7-9zwtc" Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.935138 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:47 crc kubenswrapper[4775]: I0127 11:40:47.946183 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-557bbc7df7-9zwtc"] Jan 27 11:40:49 crc kubenswrapper[4775]: I0127 11:40:49.758376 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" path="/var/lib/kubelet/pods/e869df3d-c15b-4610-bb78-00ad49940d17/volumes" Jan 27 11:40:50 crc kubenswrapper[4775]: I0127 11:40:50.944068 4775 generic.go:334] "Generic (PLEG): container finished" podID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerID="fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab" exitCode=0 Jan 27 11:40:50 crc kubenswrapper[4775]: I0127 11:40:50.944184 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerDied","Data":"fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab"} Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.496548 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572024 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572661 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.572719 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") pod \"b77cbe7c-5901-44d2-959f-5435b8adbc85\" (UID: \"b77cbe7c-5901-44d2-959f-5435b8adbc85\") " Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.577702 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l" (OuterVolumeSpecName: "kube-api-access-b8f8l") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "kube-api-access-b8f8l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.588635 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts" (OuterVolumeSpecName: "scripts") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.605018 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.617484 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data" (OuterVolumeSpecName: "config-data") pod "b77cbe7c-5901-44d2-959f-5435b8adbc85" (UID: "b77cbe7c-5901-44d2-959f-5435b8adbc85"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673826 4775 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-scripts\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673859 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8f8l\" (UniqueName: \"kubernetes.io/projected/b77cbe7c-5901-44d2-959f-5435b8adbc85-kube-api-access-b8f8l\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673871 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.673880 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b77cbe7c-5901-44d2-959f-5435b8adbc85-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968508 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-4lnkz" event={"ID":"b77cbe7c-5901-44d2-959f-5435b8adbc85","Type":"ContainerDied","Data":"a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3"}
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968542 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7dcc3334ba43ae339002d5e9fb45a9710f077d3e66f6c2787cce66bc2959ae3"
Jan 27 11:40:52 crc kubenswrapper[4775]: I0127 11:40:52.968591 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-4lnkz"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185313 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185864 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log" containerID="cri-o://cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.185995 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api" containerID="cri-o://bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.207088 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.207616 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler" containerID="cri-o://5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219496 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219725 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" containerID="cri-o://8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.219844 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" containerID="cri-o://b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" gracePeriod=30
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.810426 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977006 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80" exitCode=0
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977035 4775 generic.go:334] "Generic (PLEG): container finished" podID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588" exitCode=143
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977051 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977036 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977078 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977102 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f4deda10-439e-4a94-b215-968b1f49a1f7","Type":"ContainerDied","Data":"e9b280e81c9b3e2f27e2a003b985657f4d413b9e18eca2cc53bed8cbd3cdcb27"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.977116 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.984299 4775 generic.go:334] "Generic (PLEG): container finished" podID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerID="8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" exitCode=143
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.984329 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d"}
Jan 27 11:40:53 crc kubenswrapper[4775]: I0127 11:40:53.997288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:53.999994 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000072 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000140 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000211 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000252 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") pod \"f4deda10-439e-4a94-b215-968b1f49a1f7\" (UID: \"f4deda10-439e-4a94-b215-968b1f49a1f7\") "
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.000623 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs" (OuterVolumeSpecName: "logs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.001392 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f4deda10-439e-4a94-b215-968b1f49a1f7-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.005703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf" (OuterVolumeSpecName: "kube-api-access-z9zsf") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "kube-api-access-z9zsf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.009674 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.030419 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.032418 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.032474 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} err="failed to get container status \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.032497 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.033250 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.033277 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data" (OuterVolumeSpecName: "config-data") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.034707 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.034748 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} err="failed to get container status \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.034765 4775 scope.go:117] "RemoveContainer" containerID="bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035078 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80"} err="failed to get container status \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": rpc error: code = NotFound desc = could not find container \"bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80\": container with ID starting with bac5cd941ec49755bcc4c06a7d27e52d40c2534385a431543c517493d50c9c80 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035096 4775 scope.go:117] "RemoveContainer" containerID="cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.035487 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588"} err="failed to get container status \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": rpc error: code = NotFound desc = could not find container \"cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588\": container with ID starting with cf9fd7d037b2ba1cf361bec080463244587e2c17f141c931eb8a37538a0e7588 not found: ID does not exist"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.061920 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.061946 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f4deda10-439e-4a94-b215-968b1f49a1f7" (UID: "f4deda10-439e-4a94-b215-968b1f49a1f7"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102580 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zsf\" (UniqueName: \"kubernetes.io/projected/f4deda10-439e-4a94-b215-968b1f49a1f7-kube-api-access-z9zsf\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102626 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102635 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102644 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.102653 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4deda10-439e-4a94-b215-968b1f49a1f7-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.332832 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.355963 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375210 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375644 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="init"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375662 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="init"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375676 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375684 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375710 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375717 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375735 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375740 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.375750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375756 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375905 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-api"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375918 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" containerName="nova-api-log"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375932 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e869df3d-c15b-4610-bb78-00ad49940d17" containerName="dnsmasq-dns"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.375949 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" containerName="nova-manage"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.376971 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.384418 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.387962 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.388225 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.389232 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407284 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407375 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407534 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.407668 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.509303 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510365 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510468 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510510 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.510547 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.511716 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.513987 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.515587 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.516490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.517008 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.538147 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"nova-api-0\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: I0127 11:40:54.711353 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.973681 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.975193 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.976580 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Jan 27 11:40:54 crc kubenswrapper[4775]: E0127 11:40:54.976612 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler"
Jan 27 11:40:55 crc kubenswrapper[4775]: I0127 11:40:55.145525 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:40:55 crc kubenswrapper[4775]: W0127 11:40:55.155138 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2 WatchSource:0}: Error finding container e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2: Status 404 returned error can't find the container with id e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2
Jan 27 11:40:55 crc kubenswrapper[4775]: I0127 11:40:55.755112 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4deda10-439e-4a94-b215-968b1f49a1f7" path="/var/lib/kubelet/pods/f4deda10-439e-4a94-b215-968b1f49a1f7/volumes"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001896 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.001906 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerStarted","Data":"e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2"}
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.018639 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.018621641 podStartE2EDuration="2.018621641s" podCreationTimestamp="2026-01-27 11:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:40:56.016769851 +0000 UTC m=+1235.158367628" watchObservedRunningTime="2026-01-27 11:40:56.018621641 +0000 UTC m=+1235.160219418"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.670081 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:55464->10.217.0.209:8775: read: connection reset by peer"
Jan 27 11:40:56 crc kubenswrapper[4775]: I0127 11:40:56.670357 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.209:8775/\": read tcp 10.217.0.2:55474->10.217.0.209:8775: read: connection reset by peer"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.011377 4775 generic.go:334] "Generic (PLEG): container finished" podID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerID="b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" exitCode=0
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012266 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f"}
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7916937d-e997-4d88-8a6b-9fecf57f6828","Type":"ContainerDied","Data":"659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"}
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.012345 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="659a755d8117f607fb5b143fa1ae054d06f0293125a832cbe5f099c7b00e97ab"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.092298 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.159988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160076 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160125 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160149 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160173 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") pod \"7916937d-e997-4d88-8a6b-9fecf57f6828\" (UID: \"7916937d-e997-4d88-8a6b-9fecf57f6828\") "
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.160770 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs" (OuterVolumeSpecName: "logs") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.168672 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn" (OuterVolumeSpecName: "kube-api-access-7swxn") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "kube-api-access-7swxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.205956 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.214007 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data" (OuterVolumeSpecName: "config-data") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.253226 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7916937d-e997-4d88-8a6b-9fecf57f6828" (UID: "7916937d-e997-4d88-8a6b-9fecf57f6828"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262590 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262619 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262629 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7916937d-e997-4d88-8a6b-9fecf57f6828-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262636 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7swxn\" (UniqueName: \"kubernetes.io/projected/7916937d-e997-4d88-8a6b-9fecf57f6828-kube-api-access-7swxn\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:57 crc kubenswrapper[4775]: I0127 11:40:57.262644 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7916937d-e997-4d88-8a6b-9fecf57f6828-logs\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.034190 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.063584 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.073019 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088385 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: E0127 11:40:58.088879 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088906 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: E0127 11:40:58.088939 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.088949 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.089155 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-log"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.089189 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" containerName="nova-metadata-metadata"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.090356 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.093249 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.094538 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.111258 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.178953 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179092 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179132 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179209 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.179255 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280767 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280884 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280968 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.280991 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.281072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.281715 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.285135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.285784 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.286196 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.305119 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"nova-metadata-0\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.419794 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.851560 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:40:58 crc kubenswrapper[4775]: I0127 11:40:58.994479 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046043 4775 generic.go:334] "Generic (PLEG): container finished" podID="b76fecdf-e253-454b-8e4e-4c9109834188" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e" exitCode=0
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046101 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046890 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerDied","Data":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"}
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.046984 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b76fecdf-e253-454b-8e4e-4c9109834188","Type":"ContainerDied","Data":"c155607c7d182e5993c92c25a4eb742bffffa860a5fe816b477ae3783c2ec4bb"}
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.047042 4775 scope.go:117] "RemoveContainer" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.047856 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"cc025fb49ac4acec92f7c01cfadfc52510ce9eff9a04601bf2e7f2c28847302c"}
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.077981 4775 scope.go:117] "RemoveContainer" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"
Jan 27 11:40:59 crc kubenswrapper[4775]: E0127 11:40:59.078780 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": container with ID starting with 5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e not found: ID does not exist" containerID="5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.078895 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e"} err="failed to get container status \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": rpc error: code = NotFound desc = could not find container \"5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e\": container with ID starting with 5badad70b317bef0996e0ff5b6d584943bb7c8bb6a6995892e76aaf432c0952e not found: ID does not exist"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.095986 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") "
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.096229 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") "
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.096418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") pod \"b76fecdf-e253-454b-8e4e-4c9109834188\" (UID: \"b76fecdf-e253-454b-8e4e-4c9109834188\") "
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.102413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm" (OuterVolumeSpecName: "kube-api-access-9mmbm") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "kube-api-access-9mmbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.130229 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data" (OuterVolumeSpecName: "config-data") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.131852 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b76fecdf-e253-454b-8e4e-4c9109834188" (UID: "b76fecdf-e253-454b-8e4e-4c9109834188"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199051 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-config-data\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199090 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mmbm\" (UniqueName: \"kubernetes.io/projected/b76fecdf-e253-454b-8e4e-4c9109834188-kube-api-access-9mmbm\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.199101 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b76fecdf-e253-454b-8e4e-4c9109834188-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.378676 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.389980 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.401435 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:59 crc kubenswrapper[4775]: E0127 11:40:59.401872 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.401890 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.402077 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" containerName="nova-scheduler-scheduler"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.402674 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.404121 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.412723 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504572 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.504629 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.606572 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.606648 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.606679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.611748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.612594 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.623056 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"nova-scheduler-0\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.724104 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.772600 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7916937d-e997-4d88-8a6b-9fecf57f6828" path="/var/lib/kubelet/pods/7916937d-e997-4d88-8a6b-9fecf57f6828/volumes"
Jan 27 11:40:59 crc kubenswrapper[4775]: I0127 11:40:59.773339 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b76fecdf-e253-454b-8e4e-4c9109834188" path="/var/lib/kubelet/pods/b76fecdf-e253-454b-8e4e-4c9109834188/volumes"
Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.059372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"}
Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.059424 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerStarted","Data":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"}
Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.083248 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.083229336 podStartE2EDuration="2.083229336s" podCreationTimestamp="2026-01-27 11:40:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:00.081022917 +0000 UTC m=+1239.222620704" watchObservedRunningTime="2026-01-27 11:41:00.083229336 +0000 UTC m=+1239.224827103"
Jan 27 11:41:00 crc kubenswrapper[4775]: I0127 11:41:00.183382 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:41:00 crc kubenswrapper[4775]: W0127 11:41:00.185413 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7600201f_fb6c_4eb7_8b0a_19078b93c131.slice/crio-2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec WatchSource:0}: Error finding container 2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec: Status 404 returned error can't find the container with id 2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec
Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.070271 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerStarted","Data":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"}
Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.070619 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerStarted","Data":"2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec"}
Jan 27 11:41:01 crc kubenswrapper[4775]: I0127 11:41:01.112303 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.112283286 podStartE2EDuration="2.112283286s" podCreationTimestamp="2026-01-27 11:40:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:01.096248637 +0000 UTC m=+1240.237846414" watchObservedRunningTime="2026-01-27 11:41:01.112283286 +0000 UTC m=+1240.253881063"
Jan 27 11:41:03 crc kubenswrapper[4775]: I0127 11:41:03.420264 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 27 11:41:03 crc kubenswrapper[4775]: I0127 11:41:03.420496 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.712332 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.712695 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 27 11:41:04 crc kubenswrapper[4775]: I0127 11:41:04.725137 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 27 11:41:05 crc kubenswrapper[4775]: I0127 11:41:05.724611 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:41:05 crc kubenswrapper[4775]: I0127 11:41:05.724611 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.220:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:41:08 crc kubenswrapper[4775]: I0127 11:41:08.421092 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 27 11:41:08 crc kubenswrapper[4775]: I0127 11:41:08.422574 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.435628 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.435923 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.725082 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 27 11:41:09 crc kubenswrapper[4775]: I0127 11:41:09.772044 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 27 11:41:10 crc kubenswrapper[4775]: I0127 11:41:10.176039 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 27 11:41:11 crc kubenswrapper[4775]: I0127 11:41:11.525500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.717744 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.719216 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.722765 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 27 11:41:14 crc kubenswrapper[4775]: I0127 11:41:14.726549 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:41:15 crc kubenswrapper[4775]: I0127 11:41:15.212066 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 27 11:41:15 crc kubenswrapper[4775]: I0127 11:41:15.220347 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.426107 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.427618 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 27 11:41:18 crc kubenswrapper[4775]: I0127 11:41:18.430788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 27 11:41:19 crc kubenswrapper[4775]: I0127 11:41:19.265019 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.601272 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.601827 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" gracePeriod=30
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.640717 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.640995 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" gracePeriod=30
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.650990 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.651196 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" containerID="cri-o://a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" gracePeriod=30
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.658581 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750461 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750876 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" containerID="cri-o://680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" gracePeriod=30
Jan 27 11:41:20 crc kubenswrapper[4775]: I0127 11:41:20.750957 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" containerID="cri-o://3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" gracePeriod=30
Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.285940 4775 generic.go:334] "Generic (PLEG): container finished" podID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerID="680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" exitCode=143
Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.286077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f"}
Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.289714 4775 generic.go:334] "Generic (PLEG): container finished" podID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerID="6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" exitCode=0
Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.289759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerDied","Data":"6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592"}
Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.396433 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.402210 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.403987 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Jan 27 11:41:21 crc kubenswrapper[4775]: E0127 11:41:21.404021 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor"
Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.430133 4775 util.go:48] "No
ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618131 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618212 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618248 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618317 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.618482 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") pod \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\" (UID: \"6628a06a-9e13-4402-94d9-1df5c42e3c7a\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.631282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d" (OuterVolumeSpecName: "kube-api-access-7cv7d") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "kube-api-access-7cv7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.647554 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data" (OuterVolumeSpecName: "config-data") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.658909 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.691052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.706791 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "6628a06a-9e13-4402-94d9-1df5c42e3c7a" (UID: "6628a06a-9e13-4402-94d9-1df5c42e3c7a"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728575 4775 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728618 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728630 4775 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728651 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cv7d\" (UniqueName: \"kubernetes.io/projected/6628a06a-9e13-4402-94d9-1df5c42e3c7a-kube-api-access-7cv7d\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.728660 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6628a06a-9e13-4402-94d9-1df5c42e3c7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.829278 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931377 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931446 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.931537 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") pod \"7600201f-fb6c-4eb7-8b0a-19078b93c131\" (UID: \"7600201f-fb6c-4eb7-8b0a-19078b93c131\") " Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.935767 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s" (OuterVolumeSpecName: "kube-api-access-5zw2s") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "kube-api-access-5zw2s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.966181 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data" (OuterVolumeSpecName: "config-data") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:21 crc kubenswrapper[4775]: I0127 11:41:21.966782 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7600201f-fb6c-4eb7-8b0a-19078b93c131" (UID: "7600201f-fb6c-4eb7-8b0a-19078b93c131"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036009 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036104 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zw2s\" (UniqueName: \"kubernetes.io/projected/7600201f-fb6c-4eb7-8b0a-19078b93c131-kube-api-access-5zw2s\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.036129 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7600201f-fb6c-4eb7-8b0a-19078b93c131-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6628a06a-9e13-4402-94d9-1df5c42e3c7a","Type":"ContainerDied","Data":"fafac5d47be64962872bf10acf0347810c872eb880366bed3a34d442b4601ca2"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304082 4775 scope.go:117] "RemoveContainer" containerID="6113d21a389d355a543441146f4850a74a219c9b51229d74f630ef6722366592" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.304219 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308227 4775 generic.go:334] "Generic (PLEG): container finished" podID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" exitCode=0 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308290 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerDied","Data":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308303 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308323 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7600201f-fb6c-4eb7-8b0a-19078b93c131","Type":"ContainerDied","Data":"2771d331e7612b69f7658eb7e84583cbbafb7ca66178894edd5683aab64a88ec"} Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.308442 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" containerID="cri-o://c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" gracePeriod=30 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.309487 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" containerID="cri-o://75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" gracePeriod=30 Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.345616 4775 scope.go:117] "RemoveContainer" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.367028 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.405745 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.406542 4775 scope.go:117] "RemoveContainer" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.417175 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": container with ID starting with a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512 not found: ID does not exist" containerID="a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.417529 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512"} err="failed to get container status \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": rpc error: code = NotFound desc = could not find container \"a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512\": container with ID starting with a89835ef73ce30ff53a947cd76a11630bb6caa54df3a87e07de55111e88bd512 not found: ID does not exist" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.426505 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.441400 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.460471 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.461102 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461124 4775 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: E0127 11:41:22.461150 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461159 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461377 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" containerName="nova-cell1-novncproxy-novncproxy" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.461410 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" containerName="nova-scheduler-scheduler" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.462183 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.464715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.465409 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.465833 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.478357 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.479801 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.483647 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.491725 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.515199 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649366 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649506 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649545 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649578 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649681 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649701 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649721 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lv6\" (UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.649756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751506 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751570 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751594 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lv6\" (UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751841 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751879 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.751911 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.757206 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " 
pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.758869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.760013 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.760747 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.765020 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.765939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80ce7ac7-056a-44ec-be77-f87a96dc23f5-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.768102 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lv6\" (UniqueName: \"kubernetes.io/projected/80ce7ac7-056a-44ec-be77-f87a96dc23f5-kube-api-access-n9lv6\") pod \"nova-cell1-novncproxy-0\" (UID: \"80ce7ac7-056a-44ec-be77-f87a96dc23f5\") " pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.771076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"nova-scheduler-0\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.791953 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:22 crc kubenswrapper[4775]: I0127 11:41:22.797432 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.260358 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.274968 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.319433 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerStarted","Data":"448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.325061 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" exitCode=143 Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.325149 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.326220 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"80ce7ac7-056a-44ec-be77-f87a96dc23f5","Type":"ContainerStarted","Data":"0c168ba474b47e1c1b0eb32b31e2979ca59877a54a0bd3aeb7d60c7138384b25"} Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.754804 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6628a06a-9e13-4402-94d9-1df5c42e3c7a" path="/var/lib/kubelet/pods/6628a06a-9e13-4402-94d9-1df5c42e3c7a/volumes" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.755659 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7600201f-fb6c-4eb7-8b0a-19078b93c131" path="/var/lib/kubelet/pods/7600201f-fb6c-4eb7-8b0a-19078b93c131/volumes" Jan 27 11:41:23 crc kubenswrapper[4775]: I0127 11:41:23.965309 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:23 crc kubenswrapper[4775]: W0127 11:41:23.974809 4775 container.go:586] Failed to update stats for container "/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2": error while statting cgroup v2: [read /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2/pids.current: no such device], continuing to push stats Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.063440 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.063661 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" gracePeriod=30 Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.238133 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod724fa5b2_f306_42e9_8781_76a9166bd19e.slice/crio-e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2\": RecentStats: unable to find data in memory cache]" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.344331 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"80ce7ac7-056a-44ec-be77-f87a96dc23f5","Type":"ContainerStarted","Data":"0d6942f38537a5c467895adb33195c2af219a87282909e2d50ea799fbfcfabbb"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.346633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerStarted","Data":"876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348555 4775 generic.go:334] "Generic (PLEG): container finished" podID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerID="3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" exitCode=0 Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"724fa5b2-f306-42e9-8781-76a9166bd19e","Type":"ContainerDied","Data":"e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2"} Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.348702 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7c2a315ceadc0578d55e831bd94587ca9a3bd781e2214e5d890899c19cb96c2" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.362821 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.362804146 podStartE2EDuration="2.362804146s" podCreationTimestamp="2026-01-27 11:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:24.357495401 +0000 UTC m=+1263.499093188" watchObservedRunningTime="2026-01-27 11:41:24.362804146 +0000 UTC m=+1263.504401923" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.365247 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.384372 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.384351076 podStartE2EDuration="2.384351076s" podCreationTimestamp="2026-01-27 11:41:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:24.376662016 +0000 UTC m=+1263.518259813" watchObservedRunningTime="2026-01-27 11:41:24.384351076 +0000 UTC m=+1263.525948863" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487483 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487535 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487581 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487656 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487752 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.487804 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") pod \"724fa5b2-f306-42e9-8781-76a9166bd19e\" (UID: \"724fa5b2-f306-42e9-8781-76a9166bd19e\") " Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.490413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs" (OuterVolumeSpecName: "logs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.496719 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2" (OuterVolumeSpecName: "kube-api-access-48lz2") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "kube-api-access-48lz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.524686 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data" (OuterVolumeSpecName: "config-data") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.527900 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.588401 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591593 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lz2\" (UniqueName: \"kubernetes.io/projected/724fa5b2-f306-42e9-8781-76a9166bd19e-kube-api-access-48lz2\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591624 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591634 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591644 4775 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.591652 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/724fa5b2-f306-42e9-8781-76a9166bd19e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.614412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "724fa5b2-f306-42e9-8781-76a9166bd19e" (UID: "724fa5b2-f306-42e9-8781-76a9166bd19e"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:24 crc kubenswrapper[4775]: I0127 11:41:24.693285 4775 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724fa5b2-f306-42e9-8781-76a9166bd19e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.903878 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.905175 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.906241 4775 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Jan 27 11:41:24 crc kubenswrapper[4775]: E0127 11:41:24.906285 4775 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.360413 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362306 4775 generic.go:334] "Generic (PLEG): container finished" podID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" exitCode=0 Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362423 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362420 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerDied","Data":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"} Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362552 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" containerID="cri-o://876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" gracePeriod=30 Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362615 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"7617063e-fa32-45fc-b06e-7ecff629f7db","Type":"ContainerDied","Data":"7366e2b06b8dfe620b743759b8a53259302cbfecadc69c376be4bc38237a72e8"} Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.362641 4775 scope.go:117] "RemoveContainer" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.397624 4775 scope.go:117] "RemoveContainer" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.399836 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": container with ID starting with 7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7 not found: ID does not exist" containerID="7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.399884 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7"} err="failed to get container status \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": rpc error: code = NotFound desc = could not find container \"7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7\": container with ID starting with 7385f82059c0dd3ec251ca5ed41c23ecc6fb7127b8cf5081b1132deb92172fc7 not found: ID does not exist" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.440835 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.453121 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463029 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463568 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463593 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463626 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 
11:41:25.463635 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: E0127 11:41:25.463648 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463656 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463879 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" containerName="nova-cell0-conductor-conductor" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-log" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.463927 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" containerName="nova-api-api" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.465115 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468352 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468606 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.468763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.473947 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507558 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507631 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.507788 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") pod \"7617063e-fa32-45fc-b06e-7ecff629f7db\" (UID: \"7617063e-fa32-45fc-b06e-7ecff629f7db\") " Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.520308 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd" (OuterVolumeSpecName: "kube-api-access-cz4nd") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "kube-api-access-cz4nd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.533715 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.537704 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data" (OuterVolumeSpecName: "config-data") pod "7617063e-fa32-45fc-b06e-7ecff629f7db" (UID: "7617063e-fa32-45fc-b06e-7ecff629f7db"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610097 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610184 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610429 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610593 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610665 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.610823 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611003 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611023 4775 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7617063e-fa32-45fc-b06e-7ecff629f7db-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.611036 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz4nd\" (UniqueName: \"kubernetes.io/projected/7617063e-fa32-45fc-b06e-7ecff629f7db-kube-api-access-cz4nd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712245 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712336 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712373 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712404 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712430 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.712473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.713410 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/451ba9e3-91a7-4fd5-9e95-b827186dee9d-logs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.716667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-config-data\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.716699 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-public-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" 
Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.717778 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.717924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/451ba9e3-91a7-4fd5-9e95-b827186dee9d-internal-tls-certs\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.734477 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ggw\" (UniqueName: \"kubernetes.io/projected/451ba9e3-91a7-4fd5-9e95-b827186dee9d-kube-api-access-66ggw\") pod \"nova-api-0\" (UID: \"451ba9e3-91a7-4fd5-9e95-b827186dee9d\") " pod="openstack/nova-api-0" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.756206 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724fa5b2-f306-42e9-8781-76a9166bd19e" path="/var/lib/kubelet/pods/724fa5b2-f306-42e9-8781-76a9166bd19e/volumes" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.760662 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:54928->10.217.0.221:8775: read: connection reset by peer" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.760713 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:54930->10.217.0.221:8775: read: connection reset by peer" Jan 27 11:41:25 crc kubenswrapper[4775]: I0127 11:41:25.789905 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.197136 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.289811 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323479 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323630 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323667 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.323797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") pod \"3d743fc7-b5d1-4890-bc22-22de8227323e\" (UID: \"3d743fc7-b5d1-4890-bc22-22de8227323e\") " Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.326795 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs" (OuterVolumeSpecName: "logs") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.330306 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7" (OuterVolumeSpecName: "kube-api-access-tfsp7") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "kube-api-access-tfsp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.357581 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data" (OuterVolumeSpecName: "config-data") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.366566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.372738 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375309 4775 generic.go:334] "Generic (PLEG): container finished" podID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" exitCode=0 Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375367 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375390 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3d743fc7-b5d1-4890-bc22-22de8227323e","Type":"ContainerDied","Data":"cc025fb49ac4acec92f7c01cfadfc52510ce9eff9a04601bf2e7f2c28847302c"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375392 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.375408 4775 scope.go:117] "RemoveContainer" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.378823 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"a20713d4384bfaba787f9cde797a335807fd0a15bcdeb3c72de591e49ae0c218"} Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.386435 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "3d743fc7-b5d1-4890-bc22-22de8227323e" (UID: "3d743fc7-b5d1-4890-bc22-22de8227323e"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.417105 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.423140 4775 scope.go:117] "RemoveContainer" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425479 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfsp7\" (UniqueName: \"kubernetes.io/projected/3d743fc7-b5d1-4890-bc22-22de8227323e-kube-api-access-tfsp7\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425507 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425518 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425527 4775 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3d743fc7-b5d1-4890-bc22-22de8227323e-logs\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.425540 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3d743fc7-b5d1-4890-bc22-22de8227323e-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.432596 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.441827 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.442240 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442258 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.442291 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442298 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442462 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-metadata" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.442479 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" containerName="nova-metadata-log" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.443173 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.448306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.453479 4775 scope.go:117] "RemoveContainer" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.454532 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": container with ID starting with 75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712 not found: ID does not exist" containerID="75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.454585 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712"} err="failed to get container status \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": rpc error: code = NotFound desc = could not find container \"75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712\": container with ID starting with 75a422fff7037618171ff4d4de6dbe908ee2dda802e1c2ae245218ba71bd3712 not found: ID does not exist" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.454614 4775 scope.go:117] "RemoveContainer" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: E0127 11:41:26.454986 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": container with ID starting with c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81 not found: ID does not exist" containerID="c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.455016 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81"} err="failed to get container status \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": rpc error: code = NotFound desc = could not find container \"c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81\": container with ID starting with c24e90c8acde16262b42b5d64c0ebb64a932f2c5044767ee182c4a6d879efd81 not found: ID does not exist" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.455302 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628328 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628592 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod 
\"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.628656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.706023 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.715231 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730300 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730414 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730471 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.730827 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.735849 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.736265 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21548904-8b74-4b9b-81fb-df04e62dc7df-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.744901 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.745020 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.756997 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.757326 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.761874 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28r5f\" (UniqueName: \"kubernetes.io/projected/21548904-8b74-4b9b-81fb-df04e62dc7df-kube-api-access-28r5f\") pod \"nova-cell0-conductor-0\" (UID: \"21548904-8b74-4b9b-81fb-df04e62dc7df\") " pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.774018 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832715 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832791 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832852 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832934 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.832964 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934395 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934708 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " 
pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934753 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934809 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.934835 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.935881 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-logs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.939798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.939809 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.940027 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-config-data\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:26 crc kubenswrapper[4775]: I0127 11:41:26.955061 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4b26\" (UniqueName: \"kubernetes.io/projected/b95ff32a-7b7f-43d8-b521-6d07c8d78c99-kube-api-access-z4b26\") pod \"nova-metadata-0\" (UID: \"b95ff32a-7b7f-43d8-b521-6d07c8d78c99\") " pod="openstack/nova-metadata-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.075632 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.232558 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.312808 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 27 11:41:27 crc kubenswrapper[4775]: W0127 11:41:27.316528 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb95ff32a_7b7f_43d8_b521_6d07c8d78c99.slice/crio-20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9 WatchSource:0}: Error finding container 20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9: Status 404 returned error can't find the container with id 20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9 Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.389426 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"20d0b07ebde424bc8b7c2b07a218d1626f9a5f95fe8cbaba17b298d523aa75c9"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.392088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"21548904-8b74-4b9b-81fb-df04e62dc7df","Type":"ContainerStarted","Data":"83f9f6683189e5a3d7fdfa25e43bb3cf9538df89446775cf2db0019019ccadca"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.398026 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"1faaf895137c94f1f4724ab67c10eea2055f0e7b9a65e0befb41b89a169cfcde"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.398066 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"451ba9e3-91a7-4fd5-9e95-b827186dee9d","Type":"ContainerStarted","Data":"bd11b7d1772192a1c787c50ef54d4452b93c364617b584dc04bf19de103c2092"} Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.420857 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.420840827 podStartE2EDuration="2.420840827s" podCreationTimestamp="2026-01-27 11:41:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:27.418629326 +0000 UTC m=+1266.560227103" watchObservedRunningTime="2026-01-27 11:41:27.420840827 +0000 UTC m=+1266.562438604" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.758315 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d743fc7-b5d1-4890-bc22-22de8227323e" path="/var/lib/kubelet/pods/3d743fc7-b5d1-4890-bc22-22de8227323e/volumes" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.759581 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7617063e-fa32-45fc-b06e-7ecff629f7db" path="/var/lib/kubelet/pods/7617063e-fa32-45fc-b06e-7ecff629f7db/volumes" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.792879 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:27 crc kubenswrapper[4775]: I0127 11:41:27.798550 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 
11:41:28.408303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"21548904-8b74-4b9b-81fb-df04e62dc7df","Type":"ContainerStarted","Data":"495e47ab9dbb841fe45b54288e3a3a9b08b1650f9196643c2c010473caf3db1f"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.408426 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.410621 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"721af181f4ec9c0e5860b71c7f952716e6c800979483f969a6c73597a138efac"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.410702 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b95ff32a-7b7f-43d8-b521-6d07c8d78c99","Type":"ContainerStarted","Data":"1dd5f8c2cba5b6c42ddbf68ee4c0d313f4a062d523c00451a1df61b5ad197c22"} Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.449405 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.449385483 podStartE2EDuration="2.449385483s" podCreationTimestamp="2026-01-27 11:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:28.427486023 +0000 UTC m=+1267.569083810" watchObservedRunningTime="2026-01-27 11:41:28.449385483 +0000 UTC m=+1267.590983260" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.479628 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.479607169 podStartE2EDuration="2.479607169s" podCreationTimestamp="2026-01-27 11:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:28.46499199 +0000 UTC m=+1267.606589767" watchObservedRunningTime="2026-01-27 11:41:28.479607169 +0000 UTC m=+1267.621204946" Jan 27 11:41:28 crc kubenswrapper[4775]: I0127 11:41:28.487828 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.042801 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.180780 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.181246 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.181379 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") pod \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\" (UID: \"f2945fbf-3178-420a-bfaf-d0d9c91d610a\") " Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.200654 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw" (OuterVolumeSpecName: "kube-api-access-fb2kw") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "kube-api-access-fb2kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.215190 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.222060 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data" (OuterVolumeSpecName: "config-data") pod "f2945fbf-3178-420a-bfaf-d0d9c91d610a" (UID: "f2945fbf-3178-420a-bfaf-d0d9c91d610a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283779 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283828 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2945fbf-3178-420a-bfaf-d0d9c91d610a-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.283837 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb2kw\" (UniqueName: \"kubernetes.io/projected/f2945fbf-3178-420a-bfaf-d0d9c91d610a-kube-api-access-fb2kw\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.419894 4775 generic.go:334] "Generic (PLEG): container finished" podID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" exitCode=0 Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.419956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerDied","Data":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"} Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420016 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f2945fbf-3178-420a-bfaf-d0d9c91d610a","Type":"ContainerDied","Data":"f094ab71659251ae7c395f5253917a11c7fff1315a6946167b9b612c28b6876f"} Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420038 4775 scope.go:117] "RemoveContainer" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.420254 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.442341 4775 scope.go:117] "RemoveContainer" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: E0127 11:41:29.445010 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": container with ID starting with 178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a not found: ID does not exist" containerID="178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.445053 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a"} err="failed to get container status \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": rpc error: code = NotFound desc = could not find container \"178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a\": container with ID starting with 178bca69bb9546de4e576abba98ba1502a13f43c604d9b869bf6f36fbde80e1a not found: ID does not exist" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.455092 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.463249 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.476516 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: E0127 11:41:29.476958 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.476980 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.477336 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" containerName="nova-cell1-conductor-conductor" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.478023 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.483273 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.490313 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.587631 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.587882 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.588393 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690265 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.690428 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.693536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.693658 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8d213b2-8a0b-479c-8c94-148f1afe1db0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.710836 4775 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rppqf\" (UniqueName: \"kubernetes.io/projected/c8d213b2-8a0b-479c-8c94-148f1afe1db0-kube-api-access-rppqf\") pod \"nova-cell1-conductor-0\" (UID: \"c8d213b2-8a0b-479c-8c94-148f1afe1db0\") " pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.756563 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2945fbf-3178-420a-bfaf-d0d9c91d610a" path="/var/lib/kubelet/pods/f2945fbf-3178-420a-bfaf-d0d9c91d610a/volumes" Jan 27 11:41:29 crc kubenswrapper[4775]: I0127 11:41:29.792715 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.003706 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.149044 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 27 11:41:30 crc kubenswrapper[4775]: W0127 11:41:30.155154 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8d213b2_8a0b_479c_8c94_148f1afe1db0.slice/crio-6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335 WatchSource:0}: Error finding container 6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335: Status 404 returned error can't find the container with id 6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335 Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429007 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8d213b2-8a0b-479c-8c94-148f1afe1db0","Type":"ContainerStarted","Data":"b8e639c27679e58dbb9eace93f97e8e6619add6f6af8e248118196a362f0cd3c"} Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429315 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c8d213b2-8a0b-479c-8c94-148f1afe1db0","Type":"ContainerStarted","Data":"6ffc2485d9cc231a0b219b706de052e166146ecd42300826ca47898c80157335"} Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.429530 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:30 crc kubenswrapper[4775]: I0127 11:41:30.449493 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.449472553 podStartE2EDuration="1.449472553s" podCreationTimestamp="2026-01-27 11:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:30.444040204 +0000 UTC m=+1269.585637991" watchObservedRunningTime="2026-01-27 11:41:30.449472553 +0000 UTC m=+1269.591070330" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.077128 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.078487 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.792226 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:32 crc kubenswrapper[4775]: I0127 11:41:32.833366 4775 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.099054 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" containerID="cri-o://0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" gracePeriod=604796 Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.468330 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 27 11:41:33 crc kubenswrapper[4775]: I0127 11:41:33.948886 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" containerID="cri-o://d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" gracePeriod=604797 Jan 27 11:41:35 crc kubenswrapper[4775]: I0127 11:41:35.790734 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:35 crc kubenswrapper[4775]: I0127 11:41:35.791223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.804688 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="451ba9e3-91a7-4fd5-9e95-b827186dee9d" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.804745 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="451ba9e3-91a7-4fd5-9e95-b827186dee9d" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.225:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:36 crc kubenswrapper[4775]: I0127 11:41:36.818003 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 27 11:41:37 crc kubenswrapper[4775]: I0127 11:41:37.076609 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:37 crc kubenswrapper[4775]: I0127 11:41:37.077252 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 27 11:41:38 crc kubenswrapper[4775]: I0127 11:41:38.095654 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b95ff32a-7b7f-43d8-b521-6d07c8d78c99" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:38 crc kubenswrapper[4775]: I0127 11:41:38.095685 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b95ff32a-7b7f-43d8-b521-6d07c8d78c99" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.227:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.513089 4775 generic.go:334] "Generic (PLEG): container finished" podID="01ba029b-2296-4519-b6b1-04674355258f" containerID="0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" 
exitCode=0 Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.513308 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1"} Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.780463 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.851188 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.856067 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.95:5671: connect: connection refused" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891368 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891425 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891465 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891491 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891582 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891685 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891724 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891745 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891778 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.891873 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") pod \"01ba029b-2296-4519-b6b1-04674355258f\" (UID: \"01ba029b-2296-4519-b6b1-04674355258f\") " Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.893731 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.894684 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.894703 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.902024 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.904089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915168 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd" (OuterVolumeSpecName: "kube-api-access-mwnfd") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "kube-api-access-mwnfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915460 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info" (OuterVolumeSpecName: "pod-info") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.915320 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.961406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data" (OuterVolumeSpecName: "config-data") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:39 crc kubenswrapper[4775]: I0127 11:41:39.973999 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf" (OuterVolumeSpecName: "server-conf") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999620 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwnfd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-kube-api-access-mwnfd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999652 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999661 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999677 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999686 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999695 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999704 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01ba029b-2296-4519-b6b1-04674355258f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999712 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01ba029b-2296-4519-b6b1-04674355258f-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999720 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01ba029b-2296-4519-b6b1-04674355258f-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:39.999728 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.017173 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.031576 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "01ba029b-2296-4519-b6b1-04674355258f" (UID: "01ba029b-2296-4519-b6b1-04674355258f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.101848 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01ba029b-2296-4519-b6b1-04674355258f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.101887 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.400318 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414044 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414133 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414167 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414188 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414227 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414266 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: 
\"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414424 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414486 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.414528 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") pod \"83263987-4e3c-4e95-9083-bb6a43f52410\" (UID: \"83263987-4e3c-4e95-9083-bb6a43f52410\") " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.415941 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416175 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416290 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.416889 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420593 4775 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420608 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.420947 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info" (OuterVolumeSpecName: "pod-info") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.421135 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.421318 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.424164 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg" (OuterVolumeSpecName: "kube-api-access-2rgjg") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "kube-api-access-2rgjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.462536 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data" (OuterVolumeSpecName: "config-data") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.512173 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf" (OuterVolumeSpecName: "server-conf") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523788 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523812 4775 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/83263987-4e3c-4e95-9083-bb6a43f52410-server-conf\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523821 4775 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/83263987-4e3c-4e95-9083-bb6a43f52410-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523830 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523850 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523860 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rgjg\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-kube-api-access-2rgjg\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.523869 4775 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/83263987-4e3c-4e95-9083-bb6a43f52410-pod-info\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524589 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01ba029b-2296-4519-b6b1-04674355258f","Type":"ContainerDied","Data":"3269a97665006c13d48ba616c9cd7abaebd71e3a1886cb0e13cd8dcf70fd57ec"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524717 4775 scope.go:117] "RemoveContainer" containerID="0bbda45d64c3d5291022cfefd67ac29a65fcce1e708b8976ccb1047b144eacb1" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.524891 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539213 4775 generic.go:334] "Generic (PLEG): container finished" podID="83263987-4e3c-4e95-9083-bb6a43f52410" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" exitCode=0 Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539261 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.539279 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.540088 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"83263987-4e3c-4e95-9083-bb6a43f52410","Type":"ContainerDied","Data":"85a690e91079df6f4fe47bd15cd231753c08767dae9db9e6943a0ce49bec3588"} Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.543983 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.560275 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "83263987-4e3c-4e95-9083-bb6a43f52410" (UID: "83263987-4e3c-4e95-9083-bb6a43f52410"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.615885 4775 scope.go:117] "RemoveContainer" containerID="74bb5b1c930971f4fe9c5d05e3295a42d673f050d9c75ec7b42c0aa8e59510ca" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.617424 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.625047 4775 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/83263987-4e3c-4e95-9083-bb6a43f52410-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.625241 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.636584 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.655869 4775 scope.go:117] "RemoveContainer" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.713561 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.713992 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714007 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714032 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="setup-container" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714040 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="setup-container" Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714050 4775 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="setup-container" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714059 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="setup-container" Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.714087 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.714094 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.717028 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.717067 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.718555 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723063 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723306 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-44htb" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723320 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723344 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723351 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723370 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.723709 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.735272 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.745308 4775 scope.go:117] "RemoveContainer" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783393 4775 scope.go:117] "RemoveContainer" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.783836 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": container with ID starting with d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b not found: ID does not exist" containerID="d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783880 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b"} err="failed to get container status \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": rpc error: code = NotFound desc = could not find container \"d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b\": container with ID starting with d54befb859162fef155815d5c780e852566cb7ecc91ab5b13141e6e0162d715b not found: ID does not exist" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.783908 4775 scope.go:117] "RemoveContainer" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55" Jan 27 11:41:40 crc kubenswrapper[4775]: E0127 11:41:40.784219 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": container with ID starting with 235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55 not found: ID does not exist" containerID="235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.784259 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55"} err="failed to get container status \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": rpc error: code = NotFound desc = could not find container \"235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55\": container with ID starting with 235a2bcade411c0041b8f0d1e4990354913e22f899510312ca856872bd097b55 not found: ID does not exist" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829690 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829756 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829796 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829812 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829857 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829922 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829945 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.829967 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.830001 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.830025 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.871362 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.879343 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.898733 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.900801 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.902849 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903009 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-gp9fv" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.902885 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903318 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903431 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.903660 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.904392 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.918873 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934276 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934350 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934377 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934410 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934429 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934479 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod 
\"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934525 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934544 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934576 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.934628 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.936270 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.936432 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.937748 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.937829 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6c46c48a-ba77-4494-bc4e-f463a4072952-config-data\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.938660 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.939787 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.940216 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6c46c48a-ba77-4494-bc4e-f463a4072952-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.941961 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6c46c48a-ba77-4494-bc4e-f463a4072952-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.943480 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.946306 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.954150 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt44s\" (UniqueName: \"kubernetes.io/projected/6c46c48a-ba77-4494-bc4e-f463a4072952-kube-api-access-jt44s\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:40 crc kubenswrapper[4775]: I0127 11:41:40.987191 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"6c46c48a-ba77-4494-bc4e-f463a4072952\") " pod="openstack/rabbitmq-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036688 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036732 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 
11:41:41.036758 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036799 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036848 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036873 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036933 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.036987 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.037008 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc 
kubenswrapper[4775]: I0127 11:41:41.048514 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139934 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.139961 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140675 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140783 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140836 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140866 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140897 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " 
pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140920 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.140955 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.141072 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.141223 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.142305 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.142871 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144180 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144801 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.144867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.145721 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.155123 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.155120 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.162041 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqqz8\" (UniqueName: \"kubernetes.io/projected/bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d-kube-api-access-mqqz8\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.170484 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d\") " pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.222349 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.490627 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.551754 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"a1e55a1eea034ce3d2707a029a15d3cb21215a7a1edf9f4c3f4c4b2e615390a5"} Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.691620 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.764638 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ba029b-2296-4519-b6b1-04674355258f" path="/var/lib/kubelet/pods/01ba029b-2296-4519-b6b1-04674355258f/volumes" Jan 27 11:41:41 crc kubenswrapper[4775]: I0127 11:41:41.765976 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83263987-4e3c-4e95-9083-bb6a43f52410" path="/var/lib/kubelet/pods/83263987-4e3c-4e95-9083-bb6a43f52410/volumes" Jan 27 11:41:42 crc kubenswrapper[4775]: I0127 11:41:42.563706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"325be91f95532ec391b58080d6515074fcc561c3699a2176a132f7fad241a067"} Jan 27 11:41:43 crc kubenswrapper[4775]: I0127 11:41:43.577726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d"} Jan 27 11:41:43 crc kubenswrapper[4775]: I0127 11:41:43.579858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b"} Jan 27 11:41:44 crc kubenswrapper[4775]: I0127 11:41:44.563151 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="01ba029b-2296-4519-b6b1-04674355258f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: i/o timeout" Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.798966 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.799743 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.800655 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 27 11:41:45 crc kubenswrapper[4775]: I0127 11:41:45.805875 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:41:46 crc kubenswrapper[4775]: I0127 11:41:46.606120 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 27 11:41:46 crc kubenswrapper[4775]: I0127 11:41:46.613287 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.080856 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 
11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.082916 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.091500 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:41:47 crc kubenswrapper[4775]: I0127 11:41:47.618716 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.827334 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.844640 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.850129 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.858103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922350 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922431 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922471 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922590 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922675 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922731 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: 
\"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:48 crc kubenswrapper[4775]: I0127 11:41:48.922796 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024416 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024480 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024498 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024542 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024582 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.024634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025593 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " 
pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025599 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025867 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.025981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.026103 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.026115 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.052167 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"dnsmasq-dns-668b55cdd7-g4shv\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.179700 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.602937 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:41:49 crc kubenswrapper[4775]: I0127 11:41:49.638831 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerStarted","Data":"df8f05a4df2923539bd706608b950d7b669b477c0ab0a1ec9d75d5d196841bbe"} Jan 27 11:41:50 crc kubenswrapper[4775]: I0127 11:41:50.651305 4775 generic.go:334] "Generic (PLEG): container finished" podID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" exitCode=0 Jan 27 11:41:50 crc kubenswrapper[4775]: I0127 11:41:50.651427 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1"} Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.660932 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerStarted","Data":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.661318 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:51 crc kubenswrapper[4775]: I0127 11:41:51.692268 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" podStartSLOduration=3.692243863 podStartE2EDuration="3.692243863s" podCreationTimestamp="2026-01-27 11:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:41:51.6815763 +0000 UTC m=+1290.823174117" watchObservedRunningTime="2026-01-27 11:41:51.692243863 +0000 UTC m=+1290.833841680" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.696986 4775 generic.go:334] "Generic (PLEG): container finished" podID="8431139c-b870-4787-9a1c-758e9241e776" containerID="876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" exitCode=137 Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697087 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerDied","Data":"876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f"} Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697620 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8431139c-b870-4787-9a1c-758e9241e776","Type":"ContainerDied","Data":"448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155"} Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.697637 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="448721b3663c08d269367448c16aa9457b184a7bc00f3668a17c4a9972f25155" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.746052 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868288 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.868420 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") pod \"8431139c-b870-4787-9a1c-758e9241e776\" (UID: \"8431139c-b870-4787-9a1c-758e9241e776\") " Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.873673 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp" (OuterVolumeSpecName: "kube-api-access-f8gnp") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "kube-api-access-f8gnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.896281 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.900204 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data" (OuterVolumeSpecName: "config-data") pod "8431139c-b870-4787-9a1c-758e9241e776" (UID: "8431139c-b870-4787-9a1c-758e9241e776"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970493 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970527 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8gnp\" (UniqueName: \"kubernetes.io/projected/8431139c-b870-4787-9a1c-758e9241e776-kube-api-access-f8gnp\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:55 crc kubenswrapper[4775]: I0127 11:41:55.970536 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8431139c-b870-4787-9a1c-758e9241e776-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.705106 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.738380 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.761561 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.779508 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: E0127 11:41:56.779994 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780008 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780211 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8431139c-b870-4787-9a1c-758e9241e776" containerName="nova-scheduler-scheduler" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.780862 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.783872 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.784830 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887402 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.887478 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.989737 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.989967 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc 
kubenswrapper[4775]: I0127 11:41:56.990107 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:56 crc kubenswrapper[4775]: I0127 11:41:56.998939 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-config-data\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.001981 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4732753-3f10-4604-89d0-0c074829e53f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.007619 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgnz7\" (UniqueName: \"kubernetes.io/projected/a4732753-3f10-4604-89d0-0c074829e53f-kube-api-access-fgnz7\") pod \"nova-scheduler-0\" (UID: \"a4732753-3f10-4604-89d0-0c074829e53f\") " pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.100749 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.542277 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 27 11:41:57 crc kubenswrapper[4775]: W0127 11:41:57.545799 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4732753_3f10_4604_89d0_0c074829e53f.slice/crio-384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e WatchSource:0}: Error finding container 384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e: Status 404 returned error can't find the container with id 384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.715334 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4732753-3f10-4604-89d0-0c074829e53f","Type":"ContainerStarted","Data":"384ca7b9894a211662ae86a4d5a5ebd3b58f6e596efef5d3d77e7c64ff1f109e"} Jan 27 11:41:57 crc kubenswrapper[4775]: I0127 11:41:57.779324 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8431139c-b870-4787-9a1c-758e9241e776" path="/var/lib/kubelet/pods/8431139c-b870-4787-9a1c-758e9241e776/volumes" Jan 27 11:41:58 crc kubenswrapper[4775]: I0127 11:41:58.727086 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a4732753-3f10-4604-89d0-0c074829e53f","Type":"ContainerStarted","Data":"0528b6a09ded0f641aee83b5f822082966ea860c25272d3498d9d2637382a76c"} Jan 27 11:41:58 crc kubenswrapper[4775]: I0127 11:41:58.752843 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.752825679 podStartE2EDuration="2.752825679s" podCreationTimestamp="2026-01-27 11:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 11:41:58.749684854 +0000 UTC m=+1297.891282641" watchObservedRunningTime="2026-01-27 11:41:58.752825679 +0000 UTC m=+1297.894423456" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.182289 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.269790 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.270069 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" containerID="cri-o://cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" gracePeriod=10 Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.413391 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.414885 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.431264 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538866 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538931 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.538978 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539128 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539288 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539328 4775 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.539485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641436 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641627 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641820 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641916 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.641994 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.642400 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-config\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.643100 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-swift-storage-0\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.643159 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-nb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-dns-svc\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644400 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-openstack-edpm-ipam\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.644569 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f6c54a70-a562-4fef-b3fe-14e2a3029229-ovsdbserver-sb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.671537 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9mnb\" (UniqueName: \"kubernetes.io/projected/f6c54a70-a562-4fef-b3fe-14e2a3029229-kube-api-access-b9mnb\") pod \"dnsmasq-dns-66fc59ccbf-knrgp\" (UID: \"f6c54a70-a562-4fef-b3fe-14e2a3029229\") " pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.741234 4775 generic.go:334] "Generic (PLEG): container finished" podID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerID="cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" exitCode=0 Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742151 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c"} Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742243 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" event={"ID":"160a0f00-a19e-4522-b8ea-2a14f87906e9","Type":"ContainerDied","Data":"a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692"} Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.742257 4775 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="a34cf5c231353408ee47634ef10ee450bdbb3cc3b1d50b38665b4fa21e3b0692" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.743570 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.868651 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946357 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946433 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946508 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946699 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946758 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.946776 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") pod \"160a0f00-a19e-4522-b8ea-2a14f87906e9\" (UID: \"160a0f00-a19e-4522-b8ea-2a14f87906e9\") " Jan 27 11:41:59 crc kubenswrapper[4775]: I0127 11:41:59.951325 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f" (OuterVolumeSpecName: "kube-api-access-brv5f") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "kube-api-access-brv5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.006393 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.006412 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.011261 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.023006 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config" (OuterVolumeSpecName: "config") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.034723 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "160a0f00-a19e-4522-b8ea-2a14f87906e9" (UID: "160a0f00-a19e-4522-b8ea-2a14f87906e9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.048848 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049174 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049184 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049193 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brv5f\" (UniqueName: \"kubernetes.io/projected/160a0f00-a19e-4522-b8ea-2a14f87906e9-kube-api-access-brv5f\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049203 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.049215 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/160a0f00-a19e-4522-b8ea-2a14f87906e9-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.242315 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-66fc59ccbf-knrgp"] Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750836 4775 generic.go:334] "Generic (PLEG): container finished" podID="f6c54a70-a562-4fef-b3fe-14e2a3029229" containerID="297bbe2bfb1cf7fdd07c7efe8142ca1d447763430eb9dc9194da892803da260f" exitCode=0 Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750907 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerDied","Data":"297bbe2bfb1cf7fdd07c7efe8142ca1d447763430eb9dc9194da892803da260f"} Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750938 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerStarted","Data":"4ba7f598c87a6deb7e40d06df6db66190f3761dae9c7e9b9f7699468d7620492"} Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.750942 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddd577785-dvccn" Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.946801 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:42:00 crc kubenswrapper[4775]: I0127 11:42:00.955212 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddd577785-dvccn"] Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.758318 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" path="/var/lib/kubelet/pods/160a0f00-a19e-4522-b8ea-2a14f87906e9/volumes" Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.771725 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" event={"ID":"f6c54a70-a562-4fef-b3fe-14e2a3029229","Type":"ContainerStarted","Data":"b359ce9b935c02b0ebbd3c738ede05c444329836ac0b725930307af83477dab3"} Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.773079 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:42:01 crc kubenswrapper[4775]: I0127 11:42:01.796580 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" podStartSLOduration=2.796561309 podStartE2EDuration="2.796561309s" podCreationTimestamp="2026-01-27 11:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:01.794188344 +0000 UTC m=+1300.935786131" watchObservedRunningTime="2026-01-27 11:42:01.796561309 +0000 UTC m=+1300.938159086" Jan 27 11:42:02 crc kubenswrapper[4775]: I0127 11:42:02.101332 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.100914 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.128690 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 27 11:42:07 crc kubenswrapper[4775]: I0127 11:42:07.851244 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.759642 4775 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66fc59ccbf-knrgp" Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.841923 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:09 crc kubenswrapper[4775]: I0127 11:42:09.842600 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" containerID="cri-o://7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" gracePeriod=10 Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.302570 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.446771 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.446898 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447028 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447110 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447203 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.447389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") pod \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\" (UID: \"b5dacfeb-690d-4289-ae2d-0123e4435d4a\") " Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.452775 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g" (OuterVolumeSpecName: "kube-api-access-rp58g") pod 
"b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "kube-api-access-rp58g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.507540 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.508806 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.509892 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.511079 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.511673 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config" (OuterVolumeSpecName: "config") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.515326 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b5dacfeb-690d-4289-ae2d-0123e4435d4a" (UID: "b5dacfeb-690d-4289-ae2d-0123e4435d4a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549527 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549570 4775 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549583 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549594 4775 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549603 4775 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549613 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp58g\" (UniqueName: \"kubernetes.io/projected/b5dacfeb-690d-4289-ae2d-0123e4435d4a-kube-api-access-rp58g\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.549623 4775 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5dacfeb-690d-4289-ae2d-0123e4435d4a-config\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858311 4775 generic.go:334] "Generic (PLEG): container finished" podID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" exitCode=0 Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858354 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858379 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" event={"ID":"b5dacfeb-690d-4289-ae2d-0123e4435d4a","Type":"ContainerDied","Data":"df8f05a4df2923539bd706608b950d7b669b477c0ab0a1ec9d75d5d196841bbe"} Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858383 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-668b55cdd7-g4shv" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.858400 4775 scope.go:117] "RemoveContainer" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.883588 4775 scope.go:117] "RemoveContainer" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.895472 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.906529 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-668b55cdd7-g4shv"] Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.911878 4775 scope.go:117] "RemoveContainer" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: E0127 11:42:10.912241 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": container with ID starting with 7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7 not found: ID does not exist" containerID="7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912293 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7"} err="failed to get container status \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": rpc error: code = NotFound desc = could not find container \"7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7\": container with ID starting with 7a438bc3f40713aefcaf983a0f6c393d11e8156ebc7bb73876320b02c9adc2d7 not found: ID does not exist" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912328 4775 scope.go:117] "RemoveContainer" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: E0127 11:42:10.912655 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": container with ID starting with 65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1 not found: ID does not exist" containerID="65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1" Jan 27 11:42:10 crc kubenswrapper[4775]: I0127 11:42:10.912707 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1"} err="failed to get container status \"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": rpc error: code = NotFound desc = could not find container \"65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1\": container with ID starting with 65a65c1265fba6bfeb06ceb992a68d0acd3188b8a1a7aebe6d552111438494e1 not found: ID does not exist" Jan 27 11:42:11 crc kubenswrapper[4775]: I0127 11:42:11.756668 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" path="/var/lib/kubelet/pods/b5dacfeb-690d-4289-ae2d-0123e4435d4a/volumes" Jan 27 11:42:15 crc kubenswrapper[4775]: I0127 11:42:15.915346 4775 
generic.go:334] "Generic (PLEG): container finished" podID="6c46c48a-ba77-4494-bc4e-f463a4072952" containerID="36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d" exitCode=0 Jan 27 11:42:15 crc kubenswrapper[4775]: I0127 11:42:15.915438 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerDied","Data":"36fdf46333226ef36a60bae5ba2567a2bed7c60248a3525289f10e463659609d"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.932530 4775 generic.go:334] "Generic (PLEG): container finished" podID="bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d" containerID="81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b" exitCode=0 Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.932612 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerDied","Data":"81eaee1031536e651f818c97c18baa543f83a2f9dd9e2588f54dca81587b369b"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.936687 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6c46c48a-ba77-4494-bc4e-f463a4072952","Type":"ContainerStarted","Data":"5dd7f324981da4ba980d85e5abd1f53c99069809aa32f8868e097280dd75cdcf"} Jan 27 11:42:16 crc kubenswrapper[4775]: I0127 11:42:16.937223 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.946431 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d","Type":"ContainerStarted","Data":"dc35a311d40871e62d3663070ac78fd8947b5196bdda1cda6ea77c0a3b003d3a"} Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.946953 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.970836 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.970813584 podStartE2EDuration="37.970813584s" podCreationTimestamp="2026-01-27 11:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:16.991972589 +0000 UTC m=+1316.133570386" watchObservedRunningTime="2026-01-27 11:42:17.970813584 +0000 UTC m=+1317.112411361" Jan 27 11:42:17 crc kubenswrapper[4775]: I0127 11:42:17.978106 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.978089924 podStartE2EDuration="37.978089924s" podCreationTimestamp="2026-01-27 11:41:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 11:42:17.970933008 +0000 UTC m=+1317.112530795" watchObservedRunningTime="2026-01-27 11:42:17.978089924 +0000 UTC m=+1317.119687691" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.835653 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836714 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: 
I0127 11:42:22.836735 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836749 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836757 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836773 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836781 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: E0127 11:42:22.836809 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.836817 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="init" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.837031 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="160a0f00-a19e-4522-b8ea-2a14f87906e9" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.837071 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5dacfeb-690d-4289-ae2d-0123e4435d4a" containerName="dnsmasq-dns" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.838001 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.841944 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.842391 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.842601 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.843185 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:22 crc kubenswrapper[4775]: I0127 11:42:22.860677 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008397 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008589 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008612 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.008642 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110341 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110519 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110614 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.110639 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.115930 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.116496 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.116869 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.129967 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.221542 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:23 crc kubenswrapper[4775]: I0127 11:42:23.822614 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm"] Jan 27 11:42:24 crc kubenswrapper[4775]: I0127 11:42:24.010486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerStarted","Data":"c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722"} Jan 27 11:42:29 crc kubenswrapper[4775]: I0127 11:42:29.517872 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:42:29 crc kubenswrapper[4775]: I0127 11:42:29.518534 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:42:31 crc kubenswrapper[4775]: I0127 11:42:31.053207 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 27 11:42:31 crc kubenswrapper[4775]: I0127 11:42:31.228668 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 27 11:42:34 crc kubenswrapper[4775]: I0127 11:42:34.112645 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerStarted","Data":"4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2"} Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.147425 4775 scope.go:117] "RemoveContainer" containerID="8350f8998d5c2b4d38b2c37a8ef1d6f2931c0920b4400f0d9585d7221601d93d" Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.170224 4775 scope.go:117] "RemoveContainer" containerID="ab7d80585c73c2935a1546f42ec8127d8f07e4ebfcf89fc16e590bf9f313fdc3" Jan 27 11:42:39 crc kubenswrapper[4775]: I0127 11:42:39.213793 4775 scope.go:117] "RemoveContainer" containerID="97eb2ae0d47bf6851995b105d37a65888384ea986fa2a3b3f741906dd431a2f6" Jan 27 11:42:46 crc kubenswrapper[4775]: I0127 11:42:46.235780 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerID="4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2" exitCode=0 Jan 27 11:42:46 crc kubenswrapper[4775]: I0127 11:42:46.235912 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerDied","Data":"4421d990b66ed1807f062cea9113ca31d1aee8e79868e64188e9b6567b378eb2"} Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.802085 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991181 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991243 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991269 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.991307 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") pod \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\" (UID: \"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0\") " Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.998610 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz" (OuterVolumeSpecName: "kube-api-access-z2lkz") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "kube-api-access-z2lkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:47 crc kubenswrapper[4775]: I0127 11:42:47.998651 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.020761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory" (OuterVolumeSpecName: "inventory") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.022518 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" (UID: "ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093856 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093899 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093913 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2lkz\" (UniqueName: \"kubernetes.io/projected/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-kube-api-access-z2lkz\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.093926 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254711 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" event={"ID":"ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0","Type":"ContainerDied","Data":"c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722"} Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254758 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4e1ba8ab6414980818b6b9dd471ceedd3ec1e881e2746d87e08a9e20b38b722" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.254760 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.343496 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:48 crc kubenswrapper[4775]: E0127 11:42:48.344048 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.344071 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.344307 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.345074 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.347682 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.348149 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.350764 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.350830 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.358221 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501149 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501338 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.501444 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602739 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.602970 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.607671 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.609906 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.624247 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-thhgd\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:48 crc kubenswrapper[4775]: I0127 11:42:48.667069 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:49 crc kubenswrapper[4775]: I0127 11:42:49.229423 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd"] Jan 27 11:42:49 crc kubenswrapper[4775]: I0127 11:42:49.268571 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerStarted","Data":"4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b"} Jan 27 11:42:50 crc kubenswrapper[4775]: I0127 11:42:50.282193 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerStarted","Data":"f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1"} Jan 27 11:42:50 crc kubenswrapper[4775]: I0127 11:42:50.310924 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" podStartSLOduration=1.786030725 podStartE2EDuration="2.310890528s" podCreationTimestamp="2026-01-27 11:42:48 +0000 UTC" firstStartedPulling="2026-01-27 11:42:49.227305316 +0000 UTC m=+1348.368903113" lastFinishedPulling="2026-01-27 11:42:49.752165089 +0000 UTC m=+1348.893762916" observedRunningTime="2026-01-27 11:42:50.303993849 +0000 UTC m=+1349.445591656" watchObservedRunningTime="2026-01-27 11:42:50.310890528 +0000 UTC m=+1349.452488315" Jan 27 11:42:53 crc kubenswrapper[4775]: I0127 11:42:53.310960 4775 generic.go:334] "Generic (PLEG): container finished" podID="e2226633-918b-423c-a329-bfd52943a1b0" containerID="f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1" exitCode=0 Jan 27 11:42:53 crc kubenswrapper[4775]: I0127 11:42:53.311050 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerDied","Data":"f21ddc8a8de4557b903d243bda6d4374bd676b4a1b43d17c511b753d4ca5bbb1"} Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.759252 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.922851 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.923406 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.923511 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") pod \"e2226633-918b-423c-a329-bfd52943a1b0\" (UID: \"e2226633-918b-423c-a329-bfd52943a1b0\") " Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.930976 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms" (OuterVolumeSpecName: "kube-api-access-49cms") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "kube-api-access-49cms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.954682 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:54 crc kubenswrapper[4775]: I0127 11:42:54.954904 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory" (OuterVolumeSpecName: "inventory") pod "e2226633-918b-423c-a329-bfd52943a1b0" (UID: "e2226633-918b-423c-a329-bfd52943a1b0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026346 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026384 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49cms\" (UniqueName: \"kubernetes.io/projected/e2226633-918b-423c-a329-bfd52943a1b0-kube-api-access-49cms\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.026402 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e2226633-918b-423c-a329-bfd52943a1b0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.334961 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" event={"ID":"e2226633-918b-423c-a329-bfd52943a1b0","Type":"ContainerDied","Data":"4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b"} Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.335022 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4557cb8926c6adf3913c436efd184cd33deff2547d334fb69dcd315eba869c6b" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.335087 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-thhgd" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.414294 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:55 crc kubenswrapper[4775]: E0127 11:42:55.415016 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.415045 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.415294 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2226633-918b-423c-a329-bfd52943a1b0" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.416095 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.418401 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.418537 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.419441 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.421970 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.427405 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.536786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.536863 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.536894 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.537294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.640420 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.644364 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.645367 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.646374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.660422 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:55 crc kubenswrapper[4775]: I0127 11:42:55.743323 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:42:56 crc kubenswrapper[4775]: I0127 11:42:56.465948 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw"] Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.356839 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerStarted","Data":"f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82"} Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.357549 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerStarted","Data":"02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91"} Jan 27 11:42:57 crc kubenswrapper[4775]: I0127 11:42:57.389774 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" podStartSLOduration=1.989230566 podStartE2EDuration="2.389746736s" podCreationTimestamp="2026-01-27 11:42:55 +0000 UTC" firstStartedPulling="2026-01-27 11:42:56.471675313 +0000 UTC m=+1355.613273090" lastFinishedPulling="2026-01-27 11:42:56.872191483 +0000 UTC m=+1356.013789260" observedRunningTime="2026-01-27 11:42:57.380140053 +0000 UTC m=+1356.521737890" watchObservedRunningTime="2026-01-27 11:42:57.389746736 +0000 UTC m=+1356.531344553" Jan 27 11:42:59 crc kubenswrapper[4775]: I0127 11:42:59.517789 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:42:59 crc kubenswrapper[4775]: I0127 11:42:59.518156 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517353 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517900 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.517960 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.518780 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.518847 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51" gracePeriod=600 Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692370 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51" exitCode=0 Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692476 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51"} Jan 27 11:43:29 crc kubenswrapper[4775]: I0127 11:43:29.692794 4775 scope.go:117] "RemoveContainer" containerID="26ce088382cdfd012bc2388482c813f595be3264b04c0cc4340c1bcb667afde7" Jan 27 11:43:30 crc kubenswrapper[4775]: I0127 11:43:30.703371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"} Jan 27 11:43:39 crc kubenswrapper[4775]: I0127 11:43:39.316627 4775 scope.go:117] "RemoveContainer" containerID="ae1cd59633ddddab66ae211c50fdfac95f828c364b9df14796c53c76293906ec" Jan 27 11:43:39 crc kubenswrapper[4775]: I0127 11:43:39.364578 4775 scope.go:117] "RemoveContainer" containerID="4cde95c13e106ae0baf2b7a5b06242a46ab07d950f57252253895801adba497a" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.172685 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.175487 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.181017 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.182693 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.183103 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317398 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.317811 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419813 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419895 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.419962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.420903 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod 
\"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.426345 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.439984 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"collect-profiles-29491905-z4lft\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.555766 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:00 crc kubenswrapper[4775]: I0127 11:45:00.994512 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft"] Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670106 4775 generic.go:334] "Generic (PLEG): container finished" podID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerID="5c27e62a0c2d5741ef2ffd6f30031b9143ba1b815c078ea3acffebfbaf79467e" exitCode=0 Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670204 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerDied","Data":"5c27e62a0c2d5741ef2ffd6f30031b9143ba1b815c078ea3acffebfbaf79467e"} Jan 27 11:45:01 crc kubenswrapper[4775]: I0127 11:45:01.670507 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerStarted","Data":"a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245"} Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.002764 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070290 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070363 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.070389 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") pod \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\" (UID: \"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f\") " Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.071262 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.076785 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.076867 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8" (OuterVolumeSpecName: "kube-api-access-74wv8") pod "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" (UID: "3bad6471-db4e-4c2b-ab76-7a9476cb3b9f"). InnerVolumeSpecName "kube-api-access-74wv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.171964 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.171994 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.172003 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74wv8\" (UniqueName: \"kubernetes.io/projected/3bad6471-db4e-4c2b-ab76-7a9476cb3b9f-kube-api-access-74wv8\") on node \"crc\" DevicePath \"\"" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686384 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" event={"ID":"3bad6471-db4e-4c2b-ab76-7a9476cb3b9f","Type":"ContainerDied","Data":"a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245"} Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686753 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7fefa1620ee958e3d0d5848cd7261bd17b6011913e33e5a7718e96c1ea60245" Jan 27 11:45:03 crc kubenswrapper[4775]: I0127 11:45:03.686426 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491905-z4lft" Jan 27 11:45:29 crc kubenswrapper[4775]: I0127 11:45:29.517937 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:45:29 crc kubenswrapper[4775]: I0127 11:45:29.518556 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.561105 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:49 crc kubenswrapper[4775]: E0127 11:45:49.563203 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.563237 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.563493 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bad6471-db4e-4c2b-ab76-7a9476cb3b9f" containerName="collect-profiles" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.564946 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.571638 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.622694 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.622977 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.623182 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724625 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724757 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.724803 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.725103 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.725141 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.743195 4775 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"redhat-marketplace-j4jsp\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:49 crc kubenswrapper[4775]: I0127 11:45:49.886123 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:50 crc kubenswrapper[4775]: I0127 11:45:50.325551 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:45:50 crc kubenswrapper[4775]: E0127 11:45:50.652356 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65bbbbe1_7f7d_439b_8a67_af6503dd0d59.slice/crio-conmon-36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c.scope\": RecentStats: unable to find data in memory cache]" Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.159627 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c" exitCode=0 Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.159738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"} Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.160025 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerStarted","Data":"e657cefbc3aaf1706e7be3f7e18b9398cb99f6e7fa4bb2b7577af101a745ab39"} Jan 27 11:45:51 crc kubenswrapper[4775]: I0127 11:45:51.162048 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:45:53 crc kubenswrapper[4775]: I0127 11:45:53.177994 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc" exitCode=0 Jan 27 11:45:53 crc kubenswrapper[4775]: I0127 11:45:53.178033 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"} Jan 27 11:45:54 crc kubenswrapper[4775]: I0127 11:45:54.188366 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerStarted","Data":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"} Jan 27 11:45:54 crc kubenswrapper[4775]: I0127 11:45:54.216222 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j4jsp" podStartSLOduration=2.722288125 podStartE2EDuration="5.216200123s" podCreationTimestamp="2026-01-27 11:45:49 +0000 UTC" firstStartedPulling="2026-01-27 11:45:51.16159768 +0000 UTC m=+1530.303195457" lastFinishedPulling="2026-01-27 11:45:53.655509678 +0000 UTC m=+1532.797107455" observedRunningTime="2026-01-27 11:45:54.203553796 +0000 
UTC m=+1533.345151583" watchObservedRunningTime="2026-01-27 11:45:54.216200123 +0000 UTC m=+1533.357797900" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.517437 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.518048 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.886922 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.887259 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:45:59 crc kubenswrapper[4775]: I0127 11:45:59.935523 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:00 crc kubenswrapper[4775]: I0127 11:46:00.282852 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:00 crc kubenswrapper[4775]: I0127 11:46:00.329250 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.252759 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j4jsp" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server" containerID="cri-o://1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" gracePeriod=2 Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.684778 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860592 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860746 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.860774 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") pod \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\" (UID: \"65bbbbe1-7f7d-439b-8a67-af6503dd0d59\") " Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.862080 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities" (OuterVolumeSpecName: "utilities") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.866863 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k" (OuterVolumeSpecName: "kube-api-access-zh25k") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "kube-api-access-zh25k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.888680 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65bbbbe1-7f7d-439b-8a67-af6503dd0d59" (UID: "65bbbbe1-7f7d-439b-8a67-af6503dd0d59"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963335 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963380 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh25k\" (UniqueName: \"kubernetes.io/projected/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-kube-api-access-zh25k\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:02 crc kubenswrapper[4775]: I0127 11:46:02.963394 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65bbbbe1-7f7d-439b-8a67-af6503dd0d59-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262135 4775 generic.go:334] "Generic (PLEG): container finished" podID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" exitCode=0 Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262194 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j4jsp" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262245 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"} Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262635 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j4jsp" event={"ID":"65bbbbe1-7f7d-439b-8a67-af6503dd0d59","Type":"ContainerDied","Data":"e657cefbc3aaf1706e7be3f7e18b9398cb99f6e7fa4bb2b7577af101a745ab39"} Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.262655 4775 scope.go:117] "RemoveContainer" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.291400 4775 scope.go:117] "RemoveContainer" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.292674 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.300410 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j4jsp"] Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.324090 4775 scope.go:117] "RemoveContainer" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.352829 4775 scope.go:117] "RemoveContainer" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.353296 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": container with ID starting with 1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485 not found: ID does not exist" containerID="1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353351 4775 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485"} err="failed to get container status \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": rpc error: code = NotFound desc = could not find container \"1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485\": container with ID starting with 1ce3cca8f4f31612e1c4a1f442044212505c52060bd7c23eaf593c309e650485 not found: ID does not exist" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353384 4775 scope.go:117] "RemoveContainer" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc" Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.353805 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": container with ID starting with b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc not found: ID does not exist" containerID="b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353847 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc"} err="failed to get container status \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": rpc error: code = NotFound desc = could not find container \"b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc\": container with ID starting with b514d1220503f575cda1f6ec14520cbc0514edd5da2bd8d8837b985aa7bab2bc not found: ID does not exist" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.353875 4775 scope.go:117] "RemoveContainer" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c" Jan 27 11:46:03 crc kubenswrapper[4775]: E0127 11:46:03.354297 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": container with ID starting with 36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c not found: ID does not exist" containerID="36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.354336 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c"} err="failed to get container status \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": rpc error: code = NotFound desc = could not find container \"36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c\": container with ID starting with 36808a7120615f56f6addd2d9f07b3bd084febfa3d205ab3c6f8d07a333fdd2c not found: ID does not exist" Jan 27 11:46:03 crc kubenswrapper[4775]: I0127 11:46:03.755690 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" path="/var/lib/kubelet/pods/65bbbbe1-7f7d-439b-8a67-af6503dd0d59/volumes" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.957523 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958668 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-utilities" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958688 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-utilities" Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958703 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958713 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server" Jan 27 11:46:16 crc kubenswrapper[4775]: E0127 11:46:16.958738 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-content" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.958930 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="extract-content" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.959180 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="65bbbbe1-7f7d-439b-8a67-af6503dd0d59" containerName="registry-server" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.960920 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:16 crc kubenswrapper[4775]: I0127 11:46:16.972546 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.148892 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.149027 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.149086 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251044 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251113 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod 
\"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251147 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251644 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.251681 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.274079 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"certified-operators-fk7jm\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.284985 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:17 crc kubenswrapper[4775]: I0127 11:46:17.810141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:17 crc kubenswrapper[4775]: W0127 11:46:17.818305 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01280896_28bf_48e8_82b4_a28e65351bf8.slice/crio-5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611 WatchSource:0}: Error finding container 5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611: Status 404 returned error can't find the container with id 5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611 Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405395 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a" exitCode=0 Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405469 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a"} Jan 27 11:46:18 crc kubenswrapper[4775]: I0127 11:46:18.405516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerStarted","Data":"5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611"} Jan 27 11:46:20 crc kubenswrapper[4775]: I0127 11:46:20.422675 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0" exitCode=0 Jan 27 11:46:20 crc kubenswrapper[4775]: I0127 11:46:20.422868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0"} Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.458558 4775 generic.go:334] "Generic (PLEG): container finished" podID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerID="f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82" exitCode=0 Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.458650 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerDied","Data":"f96b1a6dd88339ae4b48f43c6f6c0f5bb250530ab598f89eedb278600ec29d82"} Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.461830 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerStarted","Data":"b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31"} Jan 27 11:46:24 crc kubenswrapper[4775]: I0127 11:46:24.500462 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fk7jm" podStartSLOduration=3.087373317 podStartE2EDuration="8.500430455s" podCreationTimestamp="2026-01-27 11:46:16 +0000 UTC" firstStartedPulling="2026-01-27 11:46:18.407838712 +0000 UTC 
m=+1557.549436489" lastFinishedPulling="2026-01-27 11:46:23.82089582 +0000 UTC m=+1562.962493627" observedRunningTime="2026-01-27 11:46:24.49550416 +0000 UTC m=+1563.637101947" watchObservedRunningTime="2026-01-27 11:46:24.500430455 +0000 UTC m=+1563.642028232" Jan 27 11:46:25 crc kubenswrapper[4775]: I0127 11:46:25.868258 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.033875 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-z98pk"] Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042318 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042426 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042487 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.042548 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") pod \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\" (UID: \"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4\") " Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.044756 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-z98pk"] Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.049799 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.049811 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254" (OuterVolumeSpecName: "kube-api-access-wf254") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "kube-api-access-wf254". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.072898 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory" (OuterVolumeSpecName: "inventory") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.076756 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" (UID: "ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144246 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144278 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf254\" (UniqueName: \"kubernetes.io/projected/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-kube-api-access-wf254\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144290 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.144305 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476740 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" event={"ID":"ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4","Type":"ContainerDied","Data":"02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91"} Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476781 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02aeb2413cb91178354fb34b5ec578f65317b693847156a1beffd5d7a10f9f91" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.476844 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.565416 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"] Jan 27 11:46:26 crc kubenswrapper[4775]: E0127 11:46:26.566118 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.566211 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.566526 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.567389 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571768 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571861 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571901 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.571932 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.586778 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"] Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755561 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755865 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.755938 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.857242 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.857383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.858150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.861820 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.862025 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.873351 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-wskgh\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:26 crc kubenswrapper[4775]: I0127 11:46:26.893281 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.286074 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.286383 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.334247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.455334 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh"] Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.502314 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerStarted","Data":"3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a"} Jan 27 11:46:27 crc kubenswrapper[4775]: I0127 11:46:27.753903 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f53ed1d7-9aa1-49d4-8396-c3487e0465d6" path="/var/lib/kubelet/pods/f53ed1d7-9aa1-49d4-8396-c3487e0465d6/volumes" Jan 27 11:46:28 crc kubenswrapper[4775]: I0127 11:46:28.511292 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerStarted","Data":"68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911"} Jan 27 11:46:28 crc kubenswrapper[4775]: I0127 
11:46:28.535071 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" podStartSLOduration=1.835598302 podStartE2EDuration="2.535046333s" podCreationTimestamp="2026-01-27 11:46:26 +0000 UTC" firstStartedPulling="2026-01-27 11:46:27.467603772 +0000 UTC m=+1566.609201549" lastFinishedPulling="2026-01-27 11:46:28.167051803 +0000 UTC m=+1567.308649580" observedRunningTime="2026-01-27 11:46:28.526433307 +0000 UTC m=+1567.668031104" watchObservedRunningTime="2026-01-27 11:46:28.535046333 +0000 UTC m=+1567.676644130" Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.041704 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-m5645"] Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.050991 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-m5645"] Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517721 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517778 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.517818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.518546 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.518597 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" gracePeriod=600 Jan 27 11:46:29 crc kubenswrapper[4775]: E0127 11:46:29.659952 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:46:29 crc kubenswrapper[4775]: I0127 11:46:29.755015 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62f5bc59-5fa8-42f4-bc7b-85827a01cc9d" path="/var/lib/kubelet/pods/62f5bc59-5fa8-42f4-bc7b-85827a01cc9d/volumes" Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.530979 4775 generic.go:334] "Generic (PLEG): 
container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" exitCode=0 Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.531048 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94"} Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.531347 4775 scope.go:117] "RemoveContainer" containerID="cbdf6a049623d9cb774c7274e1659534afc097c8aad51e3cfeb95dc0922d2c51" Jan 27 11:46:30 crc kubenswrapper[4775]: I0127 11:46:30.532082 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:46:30 crc kubenswrapper[4775]: E0127 11:46:30.532376 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.030761 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"] Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.040580 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-599fs"] Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.051032 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-599fs"] Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.059789 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-2856-account-create-update-zgmqw"] Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.759023 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bbde61d-aca8-4b36-8896-9c0db3e081be" path="/var/lib/kubelet/pods/0bbde61d-aca8-4b36-8896-9c0db3e081be/volumes" Jan 27 11:46:33 crc kubenswrapper[4775]: I0127 11:46:33.760393 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3cd1d9e-b735-4f90-b92a-00353e576e10" path="/var/lib/kubelet/pods/c3cd1d9e-b735-4f90-b92a-00353e576e10/volumes" Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.034248 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"] Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.044358 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"] Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.053385 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8d1f-account-create-update-gbh56"] Jan 27 11:46:34 crc kubenswrapper[4775]: I0127 11:46:34.066612 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9763-account-create-update-dms9b"] Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.033377 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-vvxg4"] Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.041164 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-vvxg4"] 
Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.759714 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24d04bb6-3007-42c5-9753-746a6eeb7d1c" path="/var/lib/kubelet/pods/24d04bb6-3007-42c5-9753-746a6eeb7d1c/volumes" Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.760949 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f208e1de-fc0e-4deb-a093-d27604b3931f" path="/var/lib/kubelet/pods/f208e1de-fc0e-4deb-a093-d27604b3931f/volumes" Jan 27 11:46:35 crc kubenswrapper[4775]: I0127 11:46:35.762298 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f577e755-a863-4fea-9288-6cd30168b405" path="/var/lib/kubelet/pods/f577e755-a863-4fea-9288-6cd30168b405/volumes" Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.343440 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.387093 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:37 crc kubenswrapper[4775]: I0127 11:46:37.620840 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fk7jm" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" containerID="cri-o://b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" gracePeriod=2 Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.634234 4775 generic.go:334] "Generic (PLEG): container finished" podID="01280896-28bf-48e8-82b4-a28e65351bf8" containerID="b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" exitCode=0 Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.634306 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31"} Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.801272 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.978676 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.979367 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.980882 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") pod \"01280896-28bf-48e8-82b4-a28e65351bf8\" (UID: \"01280896-28bf-48e8-82b4-a28e65351bf8\") " Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.981824 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities" (OuterVolumeSpecName: "utilities") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:38 crc kubenswrapper[4775]: I0127 11:46:38.986573 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p" (OuterVolumeSpecName: "kube-api-access-fqv2p") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "kube-api-access-fqv2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.028230 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01280896-28bf-48e8-82b4-a28e65351bf8" (UID: "01280896-28bf-48e8-82b4-a28e65351bf8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084005 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqv2p\" (UniqueName: \"kubernetes.io/projected/01280896-28bf-48e8-82b4-a28e65351bf8-kube-api-access-fqv2p\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084059 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.084079 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01280896-28bf-48e8-82b4-a28e65351bf8-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.533883 4775 scope.go:117] "RemoveContainer" containerID="7b4d6f31c9c98ba053d3d16dc4c80a54a02b6f5c6992d3e72b61e7cfc30b58ab" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.571579 4775 scope.go:117] "RemoveContainer" containerID="8afc04127ae5dac867cf7f5463a37db08396e7d83dca005132a5f83a2ea9896d" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.643756 4775 scope.go:117] "RemoveContainer" containerID="5f66195a27d4424e7e63c73f2e82e91d3646c082443a037a0bda03b3cefa73cf" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653222 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fk7jm" event={"ID":"01280896-28bf-48e8-82b4-a28e65351bf8","Type":"ContainerDied","Data":"5be5cf334b570a79de7b28aa19fe919a8ae19f7feff7226b4144828d643a2611"} Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653292 4775 scope.go:117] "RemoveContainer" containerID="b1703fe518e4f131b25d2b70f4085458d406a55f1a26b53ec19be385abe3ad31" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.653515 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fk7jm" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.671420 4775 scope.go:117] "RemoveContainer" containerID="0a7460a95945a93f0c4a50f297f4b7fe68e0f3ea9e0d32b93ec9b5db49741c68" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.693769 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.701663 4775 scope.go:117] "RemoveContainer" containerID="00b37ab51003ac4db2057e78eaff4936b8c7c44607d57e697ef8ff6a716893b0" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.702124 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fk7jm"] Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.720875 4775 scope.go:117] "RemoveContainer" containerID="b680860e2593d7ee3bb455ce65bb0c417d6d9c265106d69c11a3f6d5c337e06f" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.754813 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" path="/var/lib/kubelet/pods/01280896-28bf-48e8-82b4-a28e65351bf8/volumes" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.769789 4775 scope.go:117] "RemoveContainer" containerID="4d1fd7a3a7b7e1dd9235db1eb044fef085283a18452fdab7e8dd5a79d836ec7a" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.791689 4775 scope.go:117] "RemoveContainer" containerID="a7104b478c78a88190582a427d9e420a454c991055e729bc5832a8bcf5f244d9" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.843694 4775 scope.go:117] "RemoveContainer" containerID="86a1bff7b31394585d429293e2cf406a868ddfdf2d92e362c2ef607e10a9665a" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.864529 4775 scope.go:117] "RemoveContainer" containerID="ada66549c4f1e296080bb921b685b5ff52027670033c232a5715f71a31d45760" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.880561 4775 scope.go:117] "RemoveContainer" containerID="f260a904e6d20da11c12e2ef276cb0dd004088b3878643538e823bf35507b886" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.902204 4775 scope.go:117] "RemoveContainer" containerID="b726600d4c126579c1604f5195dde261fec3e367b813eba5f4b69473ff9e521c" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.940409 4775 scope.go:117] "RemoveContainer" containerID="cceb38c9f507e6c4fd34c4cca53a771be807a04a895235a4301c6341b1fac77c" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.956905 4775 scope.go:117] "RemoveContainer" containerID="1b501489d56c612c1213704c15f0b24ba5a096453c8a67466274eb0e4a0ced9d" Jan 27 11:46:39 crc kubenswrapper[4775]: I0127 11:46:39.979761 4775 scope.go:117] "RemoveContainer" containerID="3b69c86674facf450b3f60f67ef713811fbc5e3c9c84c0321b56c4b870189985" Jan 27 11:46:40 crc kubenswrapper[4775]: I0127 11:46:40.017994 4775 scope.go:117] "RemoveContainer" containerID="25331384137e51f62cf5d50c569a969c7570079d48885c44122b0593afae0e9e" Jan 27 11:46:40 crc kubenswrapper[4775]: I0127 11:46:40.102685 4775 scope.go:117] "RemoveContainer" containerID="c884b91cb6533e39556fed9ba7b6556eae261c0e6e3cc932634018d329df984d" Jan 27 11:46:45 crc kubenswrapper[4775]: I0127 11:46:45.744960 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:46:45 crc kubenswrapper[4775]: E0127 11:46:45.745543 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:46:58 crc kubenswrapper[4775]: I0127 11:46:58.745297 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:46:58 crc kubenswrapper[4775]: E0127 11:46:58.746191 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.055274 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.066598 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-fcvx2"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.077865 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.090030 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.098349 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-62xpg"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.105765 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.114834 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.124633 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.136127 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a920-account-create-update-7gdg6"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.148734 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b4bd-account-create-update-lztz8"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.158362 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-x8mb5"] Jan 27 11:47:00 crc kubenswrapper[4775]: I0127 11:47:00.168143 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b21b-account-create-update-grvbp"] Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.757243 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066d45f9-5f72-4b81-8166-0238863b8789" path="/var/lib/kubelet/pods/066d45f9-5f72-4b81-8166-0238863b8789/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.758473 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a046ea-e8eb-40ed-a64d-b382e0a2f331" path="/var/lib/kubelet/pods/58a046ea-e8eb-40ed-a64d-b382e0a2f331/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.759266 4775 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90455f95-bcc6-4229-948c-599c91a08b2a" path="/var/lib/kubelet/pods/90455f95-bcc6-4229-948c-599c91a08b2a/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.759996 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2" path="/var/lib/kubelet/pods/a7d7f9ca-2e9c-4379-bee2-38cf61ed6cb2/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.761516 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c495d390-f7ca-4867-b334-263c03f6b211" path="/var/lib/kubelet/pods/c495d390-f7ca-4867-b334-263c03f6b211/volumes" Jan 27 11:47:01 crc kubenswrapper[4775]: I0127 11:47:01.762289 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a7ac2f-36f7-49c5-96f9-6f8b19809b07" path="/var/lib/kubelet/pods/d8a7ac2f-36f7-49c5-96f9-6f8b19809b07/volumes" Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.035211 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.043224 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kc6bw"] Jan 27 11:47:05 crc kubenswrapper[4775]: I0127 11:47:05.755370 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71de6180-54da-4c3b-8aea-73a2ccfd936a" path="/var/lib/kubelet/pods/71de6180-54da-4c3b-8aea-73a2ccfd936a/volumes" Jan 27 11:47:13 crc kubenswrapper[4775]: I0127 11:47:13.745019 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:13 crc kubenswrapper[4775]: E0127 11:47:13.746000 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:24 crc kubenswrapper[4775]: I0127 11:47:24.745581 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:24 crc kubenswrapper[4775]: E0127 11:47:24.746615 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.555734 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556213 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-content" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556234 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-content" Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556250 4775 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-utilities" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556258 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="extract-utilities" Jan 27 11:47:25 crc kubenswrapper[4775]: E0127 11:47:25.556273 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556281 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.556519 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="01280896-28bf-48e8-82b4-a28e65351bf8" containerName="registry-server" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.557915 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.566343 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.582900 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.582952 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.583019 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684419 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.684530 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: 
\"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.685035 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.685072 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.702097 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"redhat-operators-pbxcd\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:25 crc kubenswrapper[4775]: I0127 11:47:25.883073 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:26 crc kubenswrapper[4775]: I0127 11:47:26.353663 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115149 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" exitCode=0 Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115230 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0"} Jan 27 11:47:27 crc kubenswrapper[4775]: I0127 11:47:27.115561 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"c86625840bf41ea9181153767b3d3a82a3b86875abf42fbf6fc07a9e94beac5b"} Jan 27 11:47:28 crc kubenswrapper[4775]: I0127 11:47:28.124833 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} Jan 27 11:47:31 crc kubenswrapper[4775]: I0127 11:47:31.163535 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" exitCode=0 Jan 27 11:47:31 crc kubenswrapper[4775]: I0127 11:47:31.163633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} Jan 27 11:47:32 crc kubenswrapper[4775]: I0127 11:47:32.175510 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerStarted","Data":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} Jan 27 11:47:32 crc kubenswrapper[4775]: I0127 11:47:32.204185 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pbxcd" podStartSLOduration=2.388212512 podStartE2EDuration="7.204163302s" podCreationTimestamp="2026-01-27 11:47:25 +0000 UTC" firstStartedPulling="2026-01-27 11:47:27.117476098 +0000 UTC m=+1626.259073895" lastFinishedPulling="2026-01-27 11:47:31.933426908 +0000 UTC m=+1631.075024685" observedRunningTime="2026-01-27 11:47:32.195011613 +0000 UTC m=+1631.336609410" watchObservedRunningTime="2026-01-27 11:47:32.204163302 +0000 UTC m=+1631.345761089" Jan 27 11:47:35 crc kubenswrapper[4775]: I0127 11:47:35.884271 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:35 crc kubenswrapper[4775]: I0127 11:47:35.884950 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.036286 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.044623 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-sd44h"] Jan 27 11:47:36 crc kubenswrapper[4775]: I0127 11:47:36.924753 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pbxcd" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" probeResult="failure" output=< Jan 27 11:47:36 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:47:36 crc kubenswrapper[4775]: > Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.027207 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.034814 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-99pzl"] Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.744549 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:37 crc kubenswrapper[4775]: E0127 11:47:37.744847 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.755226 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73aaf8f0-0380-4eff-875b-90da115dba37" path="/var/lib/kubelet/pods/73aaf8f0-0380-4eff-875b-90da115dba37/volumes" Jan 27 11:47:37 crc kubenswrapper[4775]: I0127 11:47:37.755809 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca5aab7c-3b7a-4996-82f5-478d4100bb6c" path="/var/lib/kubelet/pods/ca5aab7c-3b7a-4996-82f5-478d4100bb6c/volumes" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.376184 4775 scope.go:117] "RemoveContainer" 
containerID="adceaeb3830c50c53d8853f905ea7baa2cdfc916451d3151ad053ea8bc41ca42" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.434816 4775 scope.go:117] "RemoveContainer" containerID="3fb6dba1ef6aef5504b2fb4bb7d21e98e86e3a8d11057b678b01d97ea7febc53" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.469393 4775 scope.go:117] "RemoveContainer" containerID="7c55ba28687b09e9f043ff5197811f82e94f5b15d3585bb9d84c0255945f85f2" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.489597 4775 scope.go:117] "RemoveContainer" containerID="ac331de51381335c4691ae4e98de7332a3c5743a5d6c666d5f05ad5b3c6fd004" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.542845 4775 scope.go:117] "RemoveContainer" containerID="876d516959295d7e0db711e27a3980ced858832560adced1e7a9b9f0d697bf7f" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.586930 4775 scope.go:117] "RemoveContainer" containerID="17411cc983dfc73db04ce363359c284ba977fc80d7b5112232e0f918ef68f140" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.610140 4775 scope.go:117] "RemoveContainer" containerID="9f638d9da6983bb9f837a053db11c7b530800ce81cdfc56efc5cba5e158a333e" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.679887 4775 scope.go:117] "RemoveContainer" containerID="60c3929eb191aa5a40f70277344a8ffb5cea8ddde6e12141b0847fb62fc4d0e9" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.699168 4775 scope.go:117] "RemoveContainer" containerID="680998a678e870e249e755477f30b2a4504f760bab8f79f38f76f47fa33c362f" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.717212 4775 scope.go:117] "RemoveContainer" containerID="ba2616ca5d5b886e0ddfe23c893276ccb71fe9923291902da4fa96d4180b8ef5" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.739711 4775 scope.go:117] "RemoveContainer" containerID="8e66e5156f741145dc91fb1f4f5c4dcef2ff5bbcecc942be3a86ad151ce0efd1" Jan 27 11:47:40 crc kubenswrapper[4775]: I0127 11:47:40.766938 4775 scope.go:117] "RemoveContainer" containerID="99a5cb170850c0b63e27c950fae2217adb226000e7879b0d85d00d895a615bdf" Jan 27 11:47:45 crc kubenswrapper[4775]: I0127 11:47:45.936771 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:45 crc kubenswrapper[4775]: I0127 11:47:45.983396 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:46 crc kubenswrapper[4775]: I0127 11:47:46.174910 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:47 crc kubenswrapper[4775]: I0127 11:47:47.308544 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pbxcd" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" containerID="cri-o://e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" gracePeriod=2 Jan 27 11:47:47 crc kubenswrapper[4775]: I0127 11:47:47.854555 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023592 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023815 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.023867 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") pod \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\" (UID: \"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f\") " Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.024812 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities" (OuterVolumeSpecName: "utilities") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.029969 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx" (OuterVolumeSpecName: "kube-api-access-z8ldx") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "kube-api-access-z8ldx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.126875 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.127317 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8ldx\" (UniqueName: \"kubernetes.io/projected/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-kube-api-access-z8ldx\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.145222 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" (UID: "8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.229045 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319473 4775 generic.go:334] "Generic (PLEG): container finished" podID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" exitCode=0 Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.319734 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pbxcd" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.320814 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pbxcd" event={"ID":"8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f","Type":"ContainerDied","Data":"c86625840bf41ea9181153767b3d3a82a3b86875abf42fbf6fc07a9e94beac5b"} Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.320836 4775 scope.go:117] "RemoveContainer" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.361666 4775 scope.go:117] "RemoveContainer" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.407615 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.415671 4775 scope.go:117] "RemoveContainer" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.416788 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pbxcd"] Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.447470 4775 scope.go:117] "RemoveContainer" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 11:47:48.447853 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": container with ID starting with e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1 not found: ID does not exist" containerID="e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.447899 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1"} err="failed to get container status \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": rpc error: code = NotFound desc = could not find container \"e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1\": container with ID starting with e5dc98742593e226d3dfd73fced24791ddb58fc5d33d67cbb157d991145b16d1 not found: ID does not exist" Jan 27 11:47:48 crc 
kubenswrapper[4775]: I0127 11:47:48.447927 4775 scope.go:117] "RemoveContainer" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 11:47:48.448120 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": container with ID starting with 533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4 not found: ID does not exist" containerID="533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448148 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4"} err="failed to get container status \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": rpc error: code = NotFound desc = could not find container \"533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4\": container with ID starting with 533d5a67c225b84e71e506af7d2fb223fb17986efcb2bf4bfb0a87a488c48da4 not found: ID does not exist" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448171 4775 scope.go:117] "RemoveContainer" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: E0127 11:47:48.448959 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": container with ID starting with d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0 not found: ID does not exist" containerID="d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0" Jan 27 11:47:48 crc kubenswrapper[4775]: I0127 11:47:48.448985 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0"} err="failed to get container status \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": rpc error: code = NotFound desc = could not find container \"d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0\": container with ID starting with d0e91d64be3bd1464dd186f9bd9c7730942003ded1958c353082b1e12528c5a0 not found: ID does not exist" Jan 27 11:47:49 crc kubenswrapper[4775]: I0127 11:47:49.757405 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" path="/var/lib/kubelet/pods/8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f/volumes" Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.034528 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.042961 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-gcjrx"] Jan 27 11:47:50 crc kubenswrapper[4775]: I0127 11:47:50.744520 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:47:50 crc kubenswrapper[4775]: E0127 11:47:50.744793 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.033348 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-74wvb"] Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.040358 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-74wvb"] Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.757170 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c313125-cfde-424b-9bb3-acb232d20ba3" path="/var/lib/kubelet/pods/5c313125-cfde-424b-9bb3-acb232d20ba3/volumes" Jan 27 11:47:51 crc kubenswrapper[4775]: I0127 11:47:51.758047 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba461ef4-49c1-4edc-ac60-1dfb91642c46" path="/var/lib/kubelet/pods/ba461ef4-49c1-4edc-ac60-1dfb91642c46/volumes" Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.046912 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.057713 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.069434 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-xbnrk"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.082574 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-2nfbz"] Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.427032 4775 generic.go:334] "Generic (PLEG): container finished" podID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerID="68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911" exitCode=0 Jan 27 11:48:00 crc kubenswrapper[4775]: I0127 11:48:00.427165 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerDied","Data":"68fab69969ac252051443544579383fd831d8133c32cad9d9c4c67e6e0fe0911"} Jan 27 11:48:01 crc kubenswrapper[4775]: I0127 11:48:01.759027 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0edaeaa2-aa90-484f-854c-db5dd181f61b" path="/var/lib/kubelet/pods/0edaeaa2-aa90-484f-854c-db5dd181f61b/volumes" Jan 27 11:48:01 crc kubenswrapper[4775]: I0127 11:48:01.760331 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2029cc7b-c115-4c17-8713-c6eed291e963" path="/var/lib/kubelet/pods/2029cc7b-c115-4c17-8713-c6eed291e963/volumes" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.046265 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.135906 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.136006 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.136032 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") pod \"e018489b-9445-4afb-8e4c-e9d52a6781d7\" (UID: \"e018489b-9445-4afb-8e4c-e9d52a6781d7\") " Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.143218 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz" (OuterVolumeSpecName: "kube-api-access-zfdjz") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "kube-api-access-zfdjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.164196 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.164641 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory" (OuterVolumeSpecName: "inventory") pod "e018489b-9445-4afb-8e4c-e9d52a6781d7" (UID: "e018489b-9445-4afb-8e4c-e9d52a6781d7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238307 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238342 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e018489b-9445-4afb-8e4c-e9d52a6781d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.238354 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdjz\" (UniqueName: \"kubernetes.io/projected/e018489b-9445-4afb-8e4c-e9d52a6781d7-kube-api-access-zfdjz\") on node \"crc\" DevicePath \"\"" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447610 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" event={"ID":"e018489b-9445-4afb-8e4c-e9d52a6781d7","Type":"ContainerDied","Data":"3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a"} Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447659 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3217b89f2dac713472e7b3a18905a4d7a31e80c6ce8d152dbe6348dc51a98d1a" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.447633 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-wskgh" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.526527 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527232 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-content" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527250 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-content" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527267 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527275 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527289 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-utilities" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527295 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="extract-utilities" Jan 27 11:48:02 crc kubenswrapper[4775]: E0127 11:48:02.527305 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527310 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: 
I0127 11:48:02.527507 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa2dbd0-fdfb-454e-bd33-cea5d8acca5f" containerName="registry-server" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.527528 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="e018489b-9445-4afb-8e4c-e9d52a6781d7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.528175 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.530879 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.530890 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.531165 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.531244 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.537223 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646576 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.646649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.748463 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.748528 4775 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.748573 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.756682 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.758693 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.770536 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-spnbk\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:02 crc kubenswrapper[4775]: I0127 11:48:02.844807 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.364183 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk"] Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.457916 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerStarted","Data":"36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512"} Jan 27 11:48:03 crc kubenswrapper[4775]: I0127 11:48:03.745177 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:03 crc kubenswrapper[4775]: E0127 11:48:03.745551 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:04 crc kubenswrapper[4775]: I0127 11:48:04.467347 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerStarted","Data":"b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8"} Jan 27 11:48:04 crc kubenswrapper[4775]: I0127 11:48:04.485193 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" podStartSLOduration=1.8738177660000002 podStartE2EDuration="2.485175982s" podCreationTimestamp="2026-01-27 11:48:02 +0000 UTC" firstStartedPulling="2026-01-27 11:48:03.364468459 +0000 UTC m=+1662.506066256" lastFinishedPulling="2026-01-27 11:48:03.975826695 +0000 UTC m=+1663.117424472" observedRunningTime="2026-01-27 11:48:04.482892842 +0000 UTC m=+1663.624490619" watchObservedRunningTime="2026-01-27 11:48:04.485175982 +0000 UTC m=+1663.626773759" Jan 27 11:48:14 crc kubenswrapper[4775]: I0127 11:48:14.745588 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:14 crc kubenswrapper[4775]: E0127 11:48:14.746375 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:29 crc kubenswrapper[4775]: I0127 11:48:29.745160 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:29 crc kubenswrapper[4775]: E0127 11:48:29.745924 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.040275 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.053147 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-6423-account-create-update-h7gvh"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.061795 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.070500 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.078803 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tfv9j"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.091419 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-p9q28"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.107489 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.116120 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-k4m7t"] Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.759136 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027dfac2-8504-46aa-9302-19df71441688" path="/var/lib/kubelet/pods/027dfac2-8504-46aa-9302-19df71441688/volumes" Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.759986 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47764d9e-0435-43b7-aa95-e0a7e0d8b9c1" path="/var/lib/kubelet/pods/47764d9e-0435-43b7-aa95-e0a7e0d8b9c1/volumes" Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.760528 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b03d69b1-c651-4b79-9ba1-581dc15737a6" path="/var/lib/kubelet/pods/b03d69b1-c651-4b79-9ba1-581dc15737a6/volumes" Jan 27 11:48:37 crc kubenswrapper[4775]: I0127 11:48:37.761063 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6287027-2778-4115-b173-62b1600d0247" path="/var/lib/kubelet/pods/d6287027-2778-4115-b173-62b1600d0247/volumes" Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.027064 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.036206 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.043357 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8d66-account-create-update-qwzzn"] Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.050033 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-8850-account-create-update-bwmll"] Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.757863 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18608c5-afda-4481-9c6d-a576dfd4d803" path="/var/lib/kubelet/pods/a18608c5-afda-4481-9c6d-a576dfd4d803/volumes" Jan 27 11:48:39 crc kubenswrapper[4775]: I0127 11:48:39.759147 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="aeed29be-d561-4bf4-bdc1-c180e1983a3c" path="/var/lib/kubelet/pods/aeed29be-d561-4bf4-bdc1-c180e1983a3c/volumes" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.036785 4775 scope.go:117] "RemoveContainer" containerID="4caff9acfbabff5d43e064a2dae71d1faf921323c384955f825a0b026f90243f" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.062439 4775 scope.go:117] "RemoveContainer" containerID="f1da3c93241fe74774825dab64f2ef30084cf90829cd29690c1d5d1e607b82cf" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.114867 4775 scope.go:117] "RemoveContainer" containerID="41709560e0a135bfad172581c43697731478b69553f5d48646b5f6b88ba2d017" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.174189 4775 scope.go:117] "RemoveContainer" containerID="7ef3f2b53db6801d250b8f062a4c055cb74eb877a306cd9ed1f923e6a13337a5" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.203062 4775 scope.go:117] "RemoveContainer" containerID="d4146f8956305fcd5ed343f07c424f8688cf68dfdc28b629aab55c50f738bb32" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.284148 4775 scope.go:117] "RemoveContainer" containerID="3e39eecfe6e3fc9edcef832aba89c2b8bb839bad8f9d02052e6eb7c6e0e5266b" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.303478 4775 scope.go:117] "RemoveContainer" containerID="a3091380a3b190141025c92d1747551aef9bfe0d5a0a8fe21ec59422863e92d3" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.321133 4775 scope.go:117] "RemoveContainer" containerID="23b16c9948b130a40404980a7031b163bab9fc293057be41f8d97640f61ddc95" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.337234 4775 scope.go:117] "RemoveContainer" containerID="398c82449e605705da69d826d01f9e9fe98c4e413ef45b6f729de523bb9ad912" Jan 27 11:48:41 crc kubenswrapper[4775]: I0127 11:48:41.359822 4775 scope.go:117] "RemoveContainer" containerID="1a9f2ed09821cb7a2fc3a6a56f74a7c65b7d39b4dfff4c1c07be78b154a6894c" Jan 27 11:48:44 crc kubenswrapper[4775]: I0127 11:48:44.745821 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:44 crc kubenswrapper[4775]: E0127 11:48:44.747094 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:48:57 crc kubenswrapper[4775]: I0127 11:48:57.747551 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:48:57 crc kubenswrapper[4775]: E0127 11:48:57.748426 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:49:08 crc kubenswrapper[4775]: I0127 11:49:08.745598 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:49:08 crc kubenswrapper[4775]: E0127 11:49:08.746512 4775 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:49:12 crc kubenswrapper[4775]: I0127 11:49:12.036588 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:49:12 crc kubenswrapper[4775]: I0127 11:49:12.044977 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-6bh7g"] Jan 27 11:49:13 crc kubenswrapper[4775]: I0127 11:49:13.754444 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5e7b0a-a4d0-4c64-b273-2b47230efd17" path="/var/lib/kubelet/pods/1b5e7b0a-a4d0-4c64-b273-2b47230efd17/volumes" Jan 27 11:49:15 crc kubenswrapper[4775]: I0127 11:49:15.094823 4775 generic.go:334] "Generic (PLEG): container finished" podID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerID="b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8" exitCode=0 Jan 27 11:49:15 crc kubenswrapper[4775]: I0127 11:49:15.094909 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerDied","Data":"b580768fb47819c4080ebc1a9b28f6e0e4fb153c7d9ea8cd8313d656fe7197f8"} Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.555142 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.691901 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.691977 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.692029 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") pod \"d688b7ee-365a-441b-a0ab-3d1cf6663988\" (UID: \"d688b7ee-365a-441b-a0ab-3d1cf6663988\") " Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.697317 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b" (OuterVolumeSpecName: "kube-api-access-m8f6b") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "kube-api-access-m8f6b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.717212 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory" (OuterVolumeSpecName: "inventory") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.721220 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d688b7ee-365a-441b-a0ab-3d1cf6663988" (UID: "d688b7ee-365a-441b-a0ab-3d1cf6663988"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795266 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795301 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8f6b\" (UniqueName: \"kubernetes.io/projected/d688b7ee-365a-441b-a0ab-3d1cf6663988-kube-api-access-m8f6b\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:16 crc kubenswrapper[4775]: I0127 11:49:16.795312 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d688b7ee-365a-441b-a0ab-3d1cf6663988-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115000 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" event={"ID":"d688b7ee-365a-441b-a0ab-3d1cf6663988","Type":"ContainerDied","Data":"36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512"} Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115034 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36ef75fcf9ba98617a57f9808118484a931f51a9acb0eecb092d3540da321512" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.115487 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-spnbk" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190194 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"] Jan 27 11:49:17 crc kubenswrapper[4775]: E0127 11:49:17.190590 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190606 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.190793 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d688b7ee-365a-441b-a0ab-3d1cf6663988" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.191369 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194335 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194563 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194616 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.194869 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202492 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202549 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.202689 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.203225 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"] Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304696 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.304865 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.309688 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.312550 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.320704 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.510146 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:17 crc kubenswrapper[4775]: I0127 11:49:17.993073 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8"] Jan 27 11:49:18 crc kubenswrapper[4775]: I0127 11:49:18.122791 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerStarted","Data":"dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705"} Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.131960 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerStarted","Data":"d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1"} Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.148260 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" podStartSLOduration=1.64297764 podStartE2EDuration="2.14822633s" podCreationTimestamp="2026-01-27 11:49:17 +0000 UTC" firstStartedPulling="2026-01-27 11:49:17.999888833 +0000 UTC m=+1737.141486610" lastFinishedPulling="2026-01-27 11:49:18.505137513 +0000 UTC m=+1737.646735300" observedRunningTime="2026-01-27 11:49:19.144818131 +0000 UTC m=+1738.286415908" watchObservedRunningTime="2026-01-27 11:49:19.14822633 +0000 UTC m=+1738.289824107" Jan 27 11:49:19 crc kubenswrapper[4775]: I0127 11:49:19.745252 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:49:19 crc kubenswrapper[4775]: E0127 11:49:19.745517 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:49:24 crc kubenswrapper[4775]: I0127 11:49:24.177026 4775 generic.go:334] "Generic (PLEG): container finished" podID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerID="d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1" exitCode=0 Jan 27 11:49:24 crc kubenswrapper[4775]: I0127 11:49:24.177137 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerDied","Data":"d047872ce4a2bc067dcda261ffd4a61fc03ddf035c94d7e8d87fa6cdc8f416c1"} Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.661281 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766005 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766117 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.766186 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") pod \"6b092f27-cfd0-4c25-beab-c347f14371a1\" (UID: \"6b092f27-cfd0-4c25-beab-c347f14371a1\") " Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.771731 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr" (OuterVolumeSpecName: "kube-api-access-l4mzr") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "kube-api-access-l4mzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.794534 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.816350 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory" (OuterVolumeSpecName: "inventory") pod "6b092f27-cfd0-4c25-beab-c347f14371a1" (UID: "6b092f27-cfd0-4c25-beab-c347f14371a1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870237 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870283 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4mzr\" (UniqueName: \"kubernetes.io/projected/6b092f27-cfd0-4c25-beab-c347f14371a1-kube-api-access-l4mzr\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:25 crc kubenswrapper[4775]: I0127 11:49:25.870297 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b092f27-cfd0-4c25-beab-c347f14371a1-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204201 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" event={"ID":"6b092f27-cfd0-4c25-beab-c347f14371a1","Type":"ContainerDied","Data":"dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705"} Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204263 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dee58003ba4ce76b4b6a42e673be463081fac00f8233ca93ab2ca8a8c19ca705" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.204745 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354147 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"] Jan 27 11:49:26 crc kubenswrapper[4775]: E0127 11:49:26.354612 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354640 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.354904 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b092f27-cfd0-4c25-beab-c347f14371a1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.355693 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.357762 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358101 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358636 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.358677 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379113 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"] Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379303 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379466 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.379564 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481194 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481317 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.481385 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.485357 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.485502 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.507798 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-87v8z\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:26 crc kubenswrapper[4775]: I0127 11:49:26.671999 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:49:27 crc kubenswrapper[4775]: I0127 11:49:27.220986 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z"] Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.224689 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerStarted","Data":"42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d"} Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.224975 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerStarted","Data":"a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b"} Jan 27 11:49:28 crc kubenswrapper[4775]: I0127 11:49:28.249797 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" podStartSLOduration=1.8414507 podStartE2EDuration="2.249777084s" podCreationTimestamp="2026-01-27 11:49:26 +0000 UTC" firstStartedPulling="2026-01-27 11:49:27.229461797 +0000 UTC m=+1746.371059574" lastFinishedPulling="2026-01-27 11:49:27.637788141 +0000 UTC m=+1746.779385958" observedRunningTime="2026-01-27 11:49:28.239565297 +0000 UTC m=+1747.381163074" watchObservedRunningTime="2026-01-27 11:49:28.249777084 +0000 UTC m=+1747.391374861" Jan 27 11:49:34 crc kubenswrapper[4775]: I0127 11:49:34.744699 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:49:34 crc kubenswrapper[4775]: E0127 11:49:34.746505 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:49:41 crc kubenswrapper[4775]: I0127 11:49:41.556402 4775 scope.go:117] "RemoveContainer" containerID="cd7130b87032009eafbd9299811458b2c0b7a08141bac0e7bfbe791fc49ad4d0" Jan 27 11:49:46 crc kubenswrapper[4775]: I0127 11:49:46.745295 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:49:46 crc kubenswrapper[4775]: E0127 11:49:46.746514 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:49:58 crc kubenswrapper[4775]: I0127 11:49:58.749812 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:49:58 crc kubenswrapper[4775]: E0127 11:49:58.752654 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:07 crc kubenswrapper[4775]: I0127 11:50:07.581058 4775 generic.go:334] "Generic (PLEG): container finished" podID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerID="42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d" exitCode=0 Jan 27 11:50:07 crc kubenswrapper[4775]: I0127 11:50:07.581186 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerDied","Data":"42ce5cf829a9ef85991d620c248ee5b5c57c48506d27e8351a295722eb4a200d"} Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.042756 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"] Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.055597 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-m2t9b"] Jan 27 11:50:08 crc kubenswrapper[4775]: I0127 11:50:08.990176 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.037863 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"] Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.043799 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-xh4b2"] Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178634 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178705 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.178792 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") pod \"2a28c09e-4891-433d-a745-f3dcfc8654aa\" (UID: \"2a28c09e-4891-433d-a745-f3dcfc8654aa\") " Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.185200 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs" (OuterVolumeSpecName: "kube-api-access-2dqcs") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "kube-api-access-2dqcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.216674 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.218644 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory" (OuterVolumeSpecName: "inventory") pod "2a28c09e-4891-433d-a745-f3dcfc8654aa" (UID: "2a28c09e-4891-433d-a745-f3dcfc8654aa"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.281984 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.282174 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2a28c09e-4891-433d-a745-f3dcfc8654aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.282206 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dqcs\" (UniqueName: \"kubernetes.io/projected/2a28c09e-4891-433d-a745-f3dcfc8654aa-kube-api-access-2dqcs\") on node \"crc\" DevicePath \"\"" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597247 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" event={"ID":"2a28c09e-4891-433d-a745-f3dcfc8654aa","Type":"ContainerDied","Data":"a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b"} Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597296 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a11cee897461bda69ca78f33345efedd73ce30c8550e139573da4d22c682184b" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.597311 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-87v8z" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.689267 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:09 crc kubenswrapper[4775]: E0127 11:50:09.690125 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.690159 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.690611 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a28c09e-4891-433d-a745-f3dcfc8654aa" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.691403 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695161 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695299 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695595 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.695715 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.700637 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.754135 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8726531a-a74e-48cd-a274-6f67ae507560" path="/var/lib/kubelet/pods/8726531a-a74e-48cd-a274-6f67ae507560/volumes" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.754855 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3942760-c6b4-43b5-9680-48d8b8ae3854" path="/var/lib/kubelet/pods/a3942760-c6b4-43b5-9680-48d8b8ae3854/volumes" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790117 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790173 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.790470 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.892806 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.893087 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.894271 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.899398 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.902763 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:09 crc kubenswrapper[4775]: I0127 11:50:09.911632 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:10 crc kubenswrapper[4775]: I0127 11:50:10.023700 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:50:10 crc kubenswrapper[4775]: I0127 11:50:10.694836 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb"] Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.612160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerStarted","Data":"d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed"} Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.612677 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerStarted","Data":"4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d"} Jan 27 11:50:11 crc kubenswrapper[4775]: I0127 11:50:11.636598 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" podStartSLOduration=2.089282853 podStartE2EDuration="2.636572877s" podCreationTimestamp="2026-01-27 11:50:09 +0000 UTC" firstStartedPulling="2026-01-27 11:50:10.697470211 +0000 UTC m=+1789.839067988" lastFinishedPulling="2026-01-27 11:50:11.244760245 +0000 UTC m=+1790.386358012" observedRunningTime="2026-01-27 11:50:11.631111598 +0000 UTC m=+1790.772709395" watchObservedRunningTime="2026-01-27 11:50:11.636572877 +0000 UTC m=+1790.778170654" Jan 27 11:50:13 crc kubenswrapper[4775]: I0127 11:50:13.745231 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:13 crc kubenswrapper[4775]: E0127 11:50:13.745747 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:25 crc kubenswrapper[4775]: I0127 11:50:25.744857 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:25 crc kubenswrapper[4775]: E0127 11:50:25.745745 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:40 crc kubenswrapper[4775]: I0127 11:50:40.745505 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:40 crc kubenswrapper[4775]: E0127 11:50:40.746698 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:41 crc kubenswrapper[4775]: I0127 11:50:41.636584 4775 scope.go:117] "RemoveContainer" containerID="7a301f6fdbdbc7fba26fdec2032cb9599d38e17acf3b3627d4e654dc3bc0fdb7" Jan 27 11:50:41 crc kubenswrapper[4775]: I0127 11:50:41.688278 4775 scope.go:117] "RemoveContainer" containerID="b754699b4de85074b5e141a6f2ae8704aa4f96f92dca88cac7a93ee7f041781e" Jan 27 11:50:51 crc kubenswrapper[4775]: I0127 11:50:51.745391 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:50:51 crc kubenswrapper[4775]: E0127 11:50:51.746232 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:50:52 crc kubenswrapper[4775]: I0127 11:50:52.049202 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:50:52 crc kubenswrapper[4775]: I0127 11:50:52.067969 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-4lnkz"] Jan 27 11:50:53 crc kubenswrapper[4775]: I0127 11:50:53.756040 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77cbe7c-5901-44d2-959f-5435b8adbc85" path="/var/lib/kubelet/pods/b77cbe7c-5901-44d2-959f-5435b8adbc85/volumes" Jan 27 11:51:05 crc kubenswrapper[4775]: I0127 11:51:05.745344 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:51:05 crc kubenswrapper[4775]: E0127 11:51:05.746412 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:51:06 crc kubenswrapper[4775]: I0127 11:51:06.125304 4775 generic.go:334] "Generic (PLEG): container finished" podID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerID="d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed" exitCode=0 Jan 27 11:51:06 crc kubenswrapper[4775]: I0127 11:51:06.125371 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerDied","Data":"d673605088dadc0d4c3014041ac36f277af63b5e907303ca06a0df62c8850fed"} Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.551969 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663199 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663380 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.663580 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") pod \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\" (UID: \"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3\") " Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.669256 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj" (OuterVolumeSpecName: "kube-api-access-dhblj") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "kube-api-access-dhblj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.689199 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory" (OuterVolumeSpecName: "inventory") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.691282 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" (UID: "a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766145 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766176 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhblj\" (UniqueName: \"kubernetes.io/projected/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-kube-api-access-dhblj\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:07 crc kubenswrapper[4775]: I0127 11:51:07.766188 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143190 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" event={"ID":"a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3","Type":"ContainerDied","Data":"4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d"} Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143233 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c29239fe93b78dbaba46f5c2a3db15797113fc41cf5f228555eae27949deb8d" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.143240 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.237951 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:08 crc kubenswrapper[4775]: E0127 11:51:08.238359 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.238379 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.238562 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.239183 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245116 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245181 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245256 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.245402 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.254136 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.276473 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.277103 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.277151 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.378908 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.379028 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.379079 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc 
kubenswrapper[4775]: I0127 11:51:08.385705 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.385777 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.394894 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"ssh-known-hosts-edpm-deployment-r78nv\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:08 crc kubenswrapper[4775]: I0127 11:51:08.594112 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.110572 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r78nv"] Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.111739 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:51:09 crc kubenswrapper[4775]: I0127 11:51:09.151265 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerStarted","Data":"31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91"} Jan 27 11:51:10 crc kubenswrapper[4775]: I0127 11:51:10.160238 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerStarted","Data":"3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6"} Jan 27 11:51:10 crc kubenswrapper[4775]: I0127 11:51:10.181993 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" podStartSLOduration=1.6244914879999999 podStartE2EDuration="2.181976121s" podCreationTimestamp="2026-01-27 11:51:08 +0000 UTC" firstStartedPulling="2026-01-27 11:51:09.111528168 +0000 UTC m=+1848.253125945" lastFinishedPulling="2026-01-27 11:51:09.669012801 +0000 UTC m=+1848.810610578" observedRunningTime="2026-01-27 11:51:10.173900361 +0000 UTC m=+1849.315498138" watchObservedRunningTime="2026-01-27 11:51:10.181976121 +0000 UTC m=+1849.323573898" Jan 27 11:51:17 crc kubenswrapper[4775]: I0127 11:51:17.211372 4775 generic.go:334] "Generic (PLEG): container finished" podID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerID="3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6" exitCode=0 Jan 27 11:51:17 crc kubenswrapper[4775]: I0127 11:51:17.211441 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" 
event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerDied","Data":"3e083d25e1102e6027f1153ce0b29f2478aa3c2ac91d859004a591b76193f8f6"} Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.694126 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770718 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770789 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.770856 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") pod \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\" (UID: \"28d386bc-d48d-41e0-9ae2-bbe8f876ba10\") " Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.786705 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm" (OuterVolumeSpecName: "kube-api-access-wvzsm") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "kube-api-access-wvzsm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.799771 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.802647 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "28d386bc-d48d-41e0-9ae2-bbe8f876ba10" (UID: "28d386bc-d48d-41e0-9ae2-bbe8f876ba10"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.872847 4775 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.873114 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:18 crc kubenswrapper[4775]: I0127 11:51:18.873126 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzsm\" (UniqueName: \"kubernetes.io/projected/28d386bc-d48d-41e0-9ae2-bbe8f876ba10-kube-api-access-wvzsm\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" event={"ID":"28d386bc-d48d-41e0-9ae2-bbe8f876ba10","Type":"ContainerDied","Data":"31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91"} Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240891 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r78nv" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.240892 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31bc709272dd855fab3ad0897347bb4062288a5eee1244b5c251cd317220ea91" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.310856 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"] Jan 27 11:51:19 crc kubenswrapper[4775]: E0127 11:51:19.311230 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.311247 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.311463 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d386bc-d48d-41e0-9ae2-bbe8f876ba10" containerName="ssh-known-hosts-edpm-deployment" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.312145 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.314662 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.314763 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.315600 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.315806 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.327155 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"] Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380348 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380427 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.380539 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482656 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482760 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.482832 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.487285 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.488522 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.499011 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-fvf2b\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:19 crc kubenswrapper[4775]: I0127 11:51:19.631932 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.133913 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b"] Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.250251 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerStarted","Data":"6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9"} Jan 27 11:51:20 crc kubenswrapper[4775]: I0127 11:51:20.745637 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:51:20 crc kubenswrapper[4775]: E0127 11:51:20.746045 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:51:21 crc kubenswrapper[4775]: I0127 11:51:21.263786 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerStarted","Data":"42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe"} Jan 27 11:51:21 crc kubenswrapper[4775]: I0127 11:51:21.293843 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" podStartSLOduration=1.868809076 podStartE2EDuration="2.293821812s" podCreationTimestamp="2026-01-27 11:51:19 +0000 UTC" firstStartedPulling="2026-01-27 
11:51:20.142681141 +0000 UTC m=+1859.284278918" lastFinishedPulling="2026-01-27 11:51:20.567693867 +0000 UTC m=+1859.709291654" observedRunningTime="2026-01-27 11:51:21.28452596 +0000 UTC m=+1860.426123737" watchObservedRunningTime="2026-01-27 11:51:21.293821812 +0000 UTC m=+1860.435419589" Jan 27 11:51:29 crc kubenswrapper[4775]: I0127 11:51:29.360644 4775 generic.go:334] "Generic (PLEG): container finished" podID="f349798f-861c-4071-b418-61fe20227133" containerID="42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe" exitCode=0 Jan 27 11:51:29 crc kubenswrapper[4775]: I0127 11:51:29.360654 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerDied","Data":"42178019447f7257f5e008e9df173ca2c966588c5dbfd4f8b641c22ae15cc2fe"} Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.811040 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922210 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922480 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.922605 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") pod \"f349798f-861c-4071-b418-61fe20227133\" (UID: \"f349798f-861c-4071-b418-61fe20227133\") " Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.928891 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh" (OuterVolumeSpecName: "kube-api-access-zbqvh") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "kube-api-access-zbqvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.948036 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory" (OuterVolumeSpecName: "inventory") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:30 crc kubenswrapper[4775]: I0127 11:51:30.950352 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f349798f-861c-4071-b418-61fe20227133" (UID: "f349798f-861c-4071-b418-61fe20227133"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024379 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbqvh\" (UniqueName: \"kubernetes.io/projected/f349798f-861c-4071-b418-61fe20227133-kube-api-access-zbqvh\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024410 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.024419 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f349798f-861c-4071-b418-61fe20227133-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386198 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" event={"ID":"f349798f-861c-4071-b418-61fe20227133","Type":"ContainerDied","Data":"6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9"} Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386650 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc84b175910104cc41f6779c5768ecbeadb244f51fad2bfffc6f93fa6a06bc9" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.386271 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-fvf2b" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.478174 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"] Jan 27 11:51:31 crc kubenswrapper[4775]: E0127 11:51:31.478750 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.478775 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.479015 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f349798f-861c-4071-b418-61fe20227133" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.479852 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.481933 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482076 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482116 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.482138 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.500780 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"] Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639294 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639669 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.639851 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.742815 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.742962 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.743106 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.748764 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.748944 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.776732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:31 crc kubenswrapper[4775]: I0127 11:51:31.796785 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:32 crc kubenswrapper[4775]: I0127 11:51:32.344046 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw"] Jan 27 11:51:32 crc kubenswrapper[4775]: I0127 11:51:32.395715 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerStarted","Data":"578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7"} Jan 27 11:51:33 crc kubenswrapper[4775]: I0127 11:51:33.405213 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerStarted","Data":"9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738"} Jan 27 11:51:33 crc kubenswrapper[4775]: I0127 11:51:33.425345 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" podStartSLOduration=2.025447413 podStartE2EDuration="2.425328304s" podCreationTimestamp="2026-01-27 11:51:31 +0000 UTC" firstStartedPulling="2026-01-27 11:51:32.346633115 +0000 UTC m=+1871.488230892" lastFinishedPulling="2026-01-27 11:51:32.746514006 +0000 UTC m=+1871.888111783" observedRunningTime="2026-01-27 11:51:33.419054292 +0000 UTC m=+1872.560652069" watchObservedRunningTime="2026-01-27 11:51:33.425328304 +0000 UTC m=+1872.566926081" Jan 27 11:51:35 crc kubenswrapper[4775]: I0127 11:51:35.746313 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:51:36 crc kubenswrapper[4775]: I0127 11:51:36.430930 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"} Jan 27 11:51:41 crc kubenswrapper[4775]: I0127 11:51:41.812198 4775 scope.go:117] "RemoveContainer" containerID="fee7236fa11e516e48176ea4ac10ecf99f92b8a3df878c241be649e46d2bcbab" Jan 27 11:51:42 crc kubenswrapper[4775]: I0127 11:51:42.480337 4775 generic.go:334] "Generic (PLEG): container finished" podID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerID="9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738" exitCode=0 Jan 27 11:51:42 crc kubenswrapper[4775]: I0127 11:51:42.480378 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerDied","Data":"9f2013e59aa1a93a24aa1037b9603d6d08abdaa914435e6bc02746de27615738"} Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.902612 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.980912 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.981369 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.981476 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") pod \"ca771db8-558f-4e69-ba8c-37ed97f534b4\" (UID: \"ca771db8-558f-4e69-ba8c-37ed97f534b4\") " Jan 27 11:51:43 crc kubenswrapper[4775]: I0127 11:51:43.986861 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw" (OuterVolumeSpecName: "kube-api-access-jf5tw") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). InnerVolumeSpecName "kube-api-access-jf5tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.008599 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory" (OuterVolumeSpecName: "inventory") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.010271 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ca771db8-558f-4e69-ba8c-37ed97f534b4" (UID: "ca771db8-558f-4e69-ba8c-37ed97f534b4"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084586 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084636 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ca771db8-558f-4e69-ba8c-37ed97f534b4-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.084648 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jf5tw\" (UniqueName: \"kubernetes.io/projected/ca771db8-558f-4e69-ba8c-37ed97f534b4-kube-api-access-jf5tw\") on node \"crc\" DevicePath \"\"" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497758 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" event={"ID":"ca771db8-558f-4e69-ba8c-37ed97f534b4","Type":"ContainerDied","Data":"578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7"} Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497815 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="578c70f55017da3c24435deada18b1e5d205c9d599812df233628b270b420fe7" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.497819 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.588791 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"] Jan 27 11:51:44 crc kubenswrapper[4775]: E0127 11:51:44.589290 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.589315 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.589593 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca771db8-558f-4e69-ba8c-37ed97f534b4" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.590420 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.592556 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593039 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593083 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593208 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593282 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.593408 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.595923 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.599751 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.601665 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"] Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696313 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696407 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696623 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696786 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696829 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696887 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.696960 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697134 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697211 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697232 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697282 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697309 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.697336 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798643 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798727 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798761 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798821 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798899 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798926 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798947 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798973 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.798993 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799013 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799033 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.799064 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.803873 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.805482 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.807297 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.808927 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.808960 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.809799 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.809858 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.810144 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.810287 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.811242 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.813135 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.814287 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.817955 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.818663 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-czgtf\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:44 crc kubenswrapper[4775]: I0127 11:51:44.915497 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:51:45 crc kubenswrapper[4775]: I0127 11:51:45.425317 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf"] Jan 27 11:51:45 crc kubenswrapper[4775]: I0127 11:51:45.507112 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerStarted","Data":"b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35"} Jan 27 11:51:46 crc kubenswrapper[4775]: I0127 11:51:46.517903 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerStarted","Data":"539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e"} Jan 27 11:51:46 crc kubenswrapper[4775]: I0127 11:51:46.544314 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" podStartSLOduration=2.104014596 podStartE2EDuration="2.544289561s" podCreationTimestamp="2026-01-27 11:51:44 +0000 UTC" firstStartedPulling="2026-01-27 11:51:45.429873187 +0000 UTC m=+1884.571470964" lastFinishedPulling="2026-01-27 11:51:45.870148152 +0000 UTC m=+1885.011745929" observedRunningTime="2026-01-27 11:51:46.53458011 +0000 UTC m=+1885.676177907" watchObservedRunningTime="2026-01-27 11:51:46.544289561 +0000 UTC m=+1885.685887338" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.090327 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.092565 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.110857 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209376 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209509 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.209754 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311507 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311653 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.311744 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.312016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.312490 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.332408 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"community-operators-clwvj\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.418754 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:04 crc kubenswrapper[4775]: I0127 11:52:04.924039 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:04 crc kubenswrapper[4775]: W0127 11:52:04.927073 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03479aab_2cb4_4bf4_b59d_399b66bdff65.slice/crio-28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8 WatchSource:0}: Error finding container 28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8: Status 404 returned error can't find the container with id 28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8 Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.684702 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" exitCode=0 Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.684759 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444"} Jan 27 11:52:05 crc kubenswrapper[4775]: I0127 11:52:05.685114 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8"} Jan 27 11:52:06 crc kubenswrapper[4775]: I0127 11:52:06.697320 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} Jan 27 11:52:07 crc kubenswrapper[4775]: I0127 11:52:07.707134 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" exitCode=0 Jan 27 11:52:07 crc kubenswrapper[4775]: I0127 11:52:07.707175 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} Jan 27 11:52:08 crc kubenswrapper[4775]: I0127 11:52:08.716569 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerStarted","Data":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} Jan 27 11:52:08 crc kubenswrapper[4775]: I0127 11:52:08.732490 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-clwvj" 
podStartSLOduration=2.259210566 podStartE2EDuration="4.732471654s" podCreationTimestamp="2026-01-27 11:52:04 +0000 UTC" firstStartedPulling="2026-01-27 11:52:05.688745069 +0000 UTC m=+1904.830342866" lastFinishedPulling="2026-01-27 11:52:08.162006187 +0000 UTC m=+1907.303603954" observedRunningTime="2026-01-27 11:52:08.730911503 +0000 UTC m=+1907.872509300" watchObservedRunningTime="2026-01-27 11:52:08.732471654 +0000 UTC m=+1907.874069431" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.418944 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.419417 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.461081 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:14 crc kubenswrapper[4775]: I0127 11:52:14.817187 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:16 crc kubenswrapper[4775]: I0127 11:52:16.483103 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:16 crc kubenswrapper[4775]: I0127 11:52:16.780128 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-clwvj" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" containerID="cri-o://b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" gracePeriod=2 Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.254547 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267346 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267591 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.267644 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") pod \"03479aab-2cb4-4bf4-b59d-399b66bdff65\" (UID: \"03479aab-2cb4-4bf4-b59d-399b66bdff65\") " Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.268413 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities" (OuterVolumeSpecName: "utilities") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.310122 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z" (OuterVolumeSpecName: "kube-api-access-ff85z") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "kube-api-access-ff85z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.369981 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.370017 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff85z\" (UniqueName: \"kubernetes.io/projected/03479aab-2cb4-4bf4-b59d-399b66bdff65-kube-api-access-ff85z\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.704911 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03479aab-2cb4-4bf4-b59d-399b66bdff65" (UID: "03479aab-2cb4-4bf4-b59d-399b66bdff65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.778831 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03479aab-2cb4-4bf4-b59d-399b66bdff65-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789299 4775 generic.go:334] "Generic (PLEG): container finished" podID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" exitCode=0 Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789351 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789382 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-clwvj" event={"ID":"03479aab-2cb4-4bf4-b59d-399b66bdff65","Type":"ContainerDied","Data":"28ad4b62192fe9ba584aabf8612666c2c71c19a2a3d532eb95f5bb71b89dd6d8"} Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789400 4775 scope.go:117] "RemoveContainer" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.789355 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-clwvj" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.817335 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.817550 4775 scope.go:117] "RemoveContainer" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.826507 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-clwvj"] Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.843554 4775 scope.go:117] "RemoveContainer" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.882561 4775 scope.go:117] "RemoveContainer" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: E0127 11:52:17.883095 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": container with ID starting with b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529 not found: ID does not exist" containerID="b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883156 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529"} err="failed to get container status \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": rpc error: code = NotFound desc = could not find container \"b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529\": container with ID starting with b325a43c12945707283e6eb76b1ea59397498238a0eef304469bed3c502ab529 not found: ID does not exist" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883185 4775 scope.go:117] "RemoveContainer" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: E0127 11:52:17.883773 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": container with ID starting with 329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8 not found: ID does not exist" containerID="329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883828 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8"} err="failed to get container status \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": rpc error: code = NotFound desc = could not find container \"329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8\": container with ID starting with 329641ae043de51fb2d06f7416abab519e8df91ff90c886c1ff52d0887c63fb8 not found: ID does not exist" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.883863 4775 scope.go:117] "RemoveContainer" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc kubenswrapper[4775]: E0127 11:52:17.884213 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": container with ID starting with 60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444 not found: ID does not exist" containerID="60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444" Jan 27 11:52:17 crc kubenswrapper[4775]: I0127 11:52:17.884245 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444"} err="failed to get container status \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": rpc error: code = NotFound desc = could not find container \"60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444\": container with ID starting with 60b8437bcdfec1c6e432fb259cde52b8a093ddbb56a874299fac3f3b1aff7444 not found: ID does not exist" Jan 27 11:52:19 crc kubenswrapper[4775]: I0127 11:52:19.766762 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" path="/var/lib/kubelet/pods/03479aab-2cb4-4bf4-b59d-399b66bdff65/volumes" Jan 27 11:52:25 crc kubenswrapper[4775]: I0127 11:52:25.866941 4775 generic.go:334] "Generic (PLEG): container finished" podID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerID="539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e" exitCode=0 Jan 27 11:52:25 crc kubenswrapper[4775]: I0127 11:52:25.867036 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerDied","Data":"539928bf723e33e91c76ecd68a410f9ee0c444d91e281999babed305b374d93e"} Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.322294 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376884 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376924 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.376949 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377015 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377048 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377077 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377121 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377241 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377309 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" 
(UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377340 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377371 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377442 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.377488 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") pod \"d002bd2d-2dcd-4ba3-841b-1306c023469b\" (UID: \"d002bd2d-2dcd-4ba3-841b-1306c023469b\") " Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385153 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385202 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385309 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385406 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385813 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28" (OuterVolumeSpecName: "kube-api-access-brh28") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "kube-api-access-brh28". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.385847 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392884 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392912 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.392942 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.393001 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.405475 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.405436 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.411943 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory" (OuterVolumeSpecName: "inventory") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.418995 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d002bd2d-2dcd-4ba3-841b-1306c023469b" (UID: "d002bd2d-2dcd-4ba3-841b-1306c023469b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479669 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479702 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479714 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479723 4775 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479733 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479741 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479754 4775 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479763 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479771 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479782 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brh28\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-kube-api-access-brh28\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479791 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d002bd2d-2dcd-4ba3-841b-1306c023469b-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479800 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479808 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.479818 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d002bd2d-2dcd-4ba3-841b-1306c023469b-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882799 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" event={"ID":"d002bd2d-2dcd-4ba3-841b-1306c023469b","Type":"ContainerDied","Data":"b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35"} Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882863 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c09cabc878d4d20c5d5e32768ca7f091ebce91f8993f1d3243f457d5f3df35" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.882899 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-czgtf" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.997788 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998215 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-utilities" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998235 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-utilities" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998251 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998262 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998276 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998282 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: E0127 11:52:27.998295 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-content" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998303 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="extract-content" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.998552 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="03479aab-2cb4-4bf4-b59d-399b66bdff65" containerName="registry-server" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 
11:52:27.998568 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="d002bd2d-2dcd-4ba3-841b-1306c023469b" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 27 11:52:27 crc kubenswrapper[4775]: I0127 11:52:27.999187 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.002223 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.002587 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003061 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003293 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.003794 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.024141 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093367 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093463 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093517 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093591 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.093664 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197264 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197679 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197720 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197777 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.197822 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.198537 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.214189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.214228 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod 
\"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.219050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.219793 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-2p96g\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.321113 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.855102 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g"] Jan 27 11:52:28 crc kubenswrapper[4775]: I0127 11:52:28.894006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerStarted","Data":"591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac"} Jan 27 11:52:29 crc kubenswrapper[4775]: I0127 11:52:29.904660 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerStarted","Data":"9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d"} Jan 27 11:52:29 crc kubenswrapper[4775]: I0127 11:52:29.925213 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" podStartSLOduration=2.453045159 podStartE2EDuration="2.925190762s" podCreationTimestamp="2026-01-27 11:52:27 +0000 UTC" firstStartedPulling="2026-01-27 11:52:28.868073706 +0000 UTC m=+1928.009671473" lastFinishedPulling="2026-01-27 11:52:29.340219289 +0000 UTC m=+1928.481817076" observedRunningTime="2026-01-27 11:52:29.922946044 +0000 UTC m=+1929.064543831" watchObservedRunningTime="2026-01-27 11:52:29.925190762 +0000 UTC m=+1929.066788539" Jan 27 11:53:35 crc kubenswrapper[4775]: I0127 11:53:35.486637 4775 generic.go:334] "Generic (PLEG): container finished" podID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerID="9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d" exitCode=0 Jan 27 11:53:35 crc kubenswrapper[4775]: I0127 11:53:35.486744 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerDied","Data":"9697a23e2dbb1b8e963d7619cfa0d83a42288fa9edbea16263fdba46daa14d3d"} Jan 27 11:53:36 crc kubenswrapper[4775]: I0127 11:53:36.946607 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146735 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146797 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146827 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.146879 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.147098 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") pod \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\" (UID: \"41359e3c-21d7-4c22-bcef-0968c2f8cca5\") " Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.154786 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.154822 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh" (OuterVolumeSpecName: "kube-api-access-mhqsh") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "kube-api-access-mhqsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.172500 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.174386 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory" (OuterVolumeSpecName: "inventory") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.176410 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "41359e3c-21d7-4c22-bcef-0968c2f8cca5" (UID: "41359e3c-21d7-4c22-bcef-0968c2f8cca5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249668 4775 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249701 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249710 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhqsh\" (UniqueName: \"kubernetes.io/projected/41359e3c-21d7-4c22-bcef-0968c2f8cca5-kube-api-access-mhqsh\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249720 4775 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.249730 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/41359e3c-21d7-4c22-bcef-0968c2f8cca5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.505726 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" event={"ID":"41359e3c-21d7-4c22-bcef-0968c2f8cca5","Type":"ContainerDied","Data":"591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac"} Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.505773 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="591b945065340d2af9ce02b0c898d46f7f3ec06efb9e2761c316ea62ece87fac" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.506409 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-2p96g" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.600747 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:37 crc kubenswrapper[4775]: E0127 11:53:37.601257 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.601283 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.601603 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="41359e3c-21d7-4c22-bcef-0968c2f8cca5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.602615 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609628 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609649 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609766 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609796 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.609851 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.610028 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.612303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763853 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763918 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763941 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.763970 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.764000 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.764081 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866095 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866258 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866340 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866386 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866417 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.866491 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869470 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869568 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.869938 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.870439 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.876145 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc 
kubenswrapper[4775]: I0127 11:53:37.882596 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:37 crc kubenswrapper[4775]: I0127 11:53:37.933026 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:53:38 crc kubenswrapper[4775]: I0127 11:53:38.435167 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97"] Jan 27 11:53:38 crc kubenswrapper[4775]: I0127 11:53:38.515305 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerStarted","Data":"124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4"} Jan 27 11:53:40 crc kubenswrapper[4775]: I0127 11:53:40.536683 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerStarted","Data":"0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b"} Jan 27 11:53:40 crc kubenswrapper[4775]: I0127 11:53:40.578163 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" podStartSLOduration=2.561401242 podStartE2EDuration="3.578121189s" podCreationTimestamp="2026-01-27 11:53:37 +0000 UTC" firstStartedPulling="2026-01-27 11:53:38.438497157 +0000 UTC m=+1997.580094934" lastFinishedPulling="2026-01-27 11:53:39.455217104 +0000 UTC m=+1998.596814881" observedRunningTime="2026-01-27 11:53:40.55966651 +0000 UTC m=+1999.701264337" watchObservedRunningTime="2026-01-27 11:53:40.578121189 +0000 UTC m=+1999.719718986" Jan 27 11:53:59 crc kubenswrapper[4775]: I0127 11:53:59.518102 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:53:59 crc kubenswrapper[4775]: I0127 11:53:59.518799 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:29 crc kubenswrapper[4775]: I0127 11:54:29.517577 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
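The pod_startup_latency_tracker entry above reports two figures for the neutron-metadata pod: podStartE2EDuration is the wall-clock time from pod creation to observed running, and podStartSLOduration appears to be that figure minus the image-pull window. A minimal sketch checking the relationship against the logged values (the time layout is an assumption about the logged format; the numbers are copied from the entry):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamp layout is an assumption about the logged format; the values
	// are copied verbatim from the pod_startup_latency_tracker entry above.
	const layout = "2006-01-02 15:04:05 -0700 MST"
	firstPull, _ := time.Parse(layout, "2026-01-27 11:53:38.438497157 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2026-01-27 11:53:39.455217104 +0000 UTC")
	e2e := 3578121189 * time.Nanosecond // podStartE2EDuration="3.578121189s"

	// E2E minus image-pull time: prints 2.561401242s, the logged podStartSLOduration.
	fmt.Println(e2e - lastPull.Sub(firstPull))
}
```

The same arithmetic reproduces the startup-latency entries that appear later in this log.

Jan 27 11:54:29 crc kubenswrapper[4775]: I0127 11:54:29.518209 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"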
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:30 crc kubenswrapper[4775]: I0127 11:54:30.987444 4775 generic.go:334] "Generic (PLEG): container finished" podID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerID="0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b" exitCode=0 Jan 27 11:54:30 crc kubenswrapper[4775]: I0127 11:54:30.987519 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerDied","Data":"0f1ef1f39ca959bb951c060526e2967aa2da81bcf3a2df13c0ff4e7031c25e4b"} Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.392330 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.429498 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.430895 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.430998 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431052 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431106 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.431174 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") pod \"352eaecd-6d51-4198-b3e6-ce59a6485be1\" (UID: \"352eaecd-6d51-4198-b3e6-ce59a6485be1\") " Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.457551 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod 
"352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.459843 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj" (OuterVolumeSpecName: "kube-api-access-kvrrj") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "kube-api-access-kvrrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.460652 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.462888 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.463566 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.477172 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory" (OuterVolumeSpecName: "inventory") pod "352eaecd-6d51-4198-b3e6-ce59a6485be1" (UID: "352eaecd-6d51-4198-b3e6-ce59a6485be1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533073 4775 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533232 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533302 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533363 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533434 4775 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/352eaecd-6d51-4198-b3e6-ce59a6485be1-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:32 crc kubenswrapper[4775]: I0127 11:54:32.533542 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvrrj\" (UniqueName: \"kubernetes.io/projected/352eaecd-6d51-4198-b3e6-ce59a6485be1-kube-api-access-kvrrj\") on node \"crc\" DevicePath \"\"" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006372 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" event={"ID":"352eaecd-6d51-4198-b3e6-ce59a6485be1","Type":"ContainerDied","Data":"124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4"} Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006415 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="124efc6595a2c07c8ba8c1e21002ee587b16c0e219e4d44dc27fd782a787b0c4" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.006491 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.165695 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:33 crc kubenswrapper[4775]: E0127 11:54:33.166275 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.166304 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.166573 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="352eaecd-6d51-4198-b3e6-ce59a6485be1" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.167399 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.172481 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.172791 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.173484 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.173676 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.174426 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.183927 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.245818 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246052 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246150 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" 
(UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246277 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.246446 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.348731 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349195 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349232 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349267 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.349316 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.352924 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: 
\"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.354908 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.355149 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.355189 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.368415 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:33 crc kubenswrapper[4775]: I0127 11:54:33.487285 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 11:54:34 crc kubenswrapper[4775]: I0127 11:54:34.029716 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm"] Jan 27 11:54:35 crc kubenswrapper[4775]: I0127 11:54:35.027706 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerStarted","Data":"1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb"} Jan 27 11:54:36 crc kubenswrapper[4775]: I0127 11:54:36.036390 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerStarted","Data":"9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d"} Jan 27 11:54:36 crc kubenswrapper[4775]: I0127 11:54:36.061086 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" podStartSLOduration=1.960950377 podStartE2EDuration="3.061062713s" podCreationTimestamp="2026-01-27 11:54:33 +0000 UTC" firstStartedPulling="2026-01-27 11:54:34.035956102 +0000 UTC m=+2053.177553879" lastFinishedPulling="2026-01-27 11:54:35.136068438 +0000 UTC m=+2054.277666215" observedRunningTime="2026-01-27 11:54:36.053911339 +0000 UTC m=+2055.195509126" watchObservedRunningTime="2026-01-27 11:54:36.061062713 +0000 UTC m=+2055.202660490" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.517867 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.518501 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.518566 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.519465 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:54:59 crc kubenswrapper[4775]: I0127 11:54:59.519535 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" gracePeriod=600 Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246276 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" 
containerID="a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" exitCode=0 Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246319 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753"} Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246654 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} Jan 27 11:55:00 crc kubenswrapper[4775]: I0127 11:55:00.246693 4775 scope.go:117] "RemoveContainer" containerID="296db5adfc72083f63d71a995ca0847cc0988af65d29cc43e0920b28687d8a94" Jan 27 11:56:06 crc kubenswrapper[4775]: I0127 11:56:06.813096 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:11 crc kubenswrapper[4775]: I0127 11:56:11.818101 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.809657 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.813334 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.813477 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.815438 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted" Jan 27 11:56:16 crc kubenswrapper[4775]: I0127 11:56:16.815765 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" containerID="cri-o://30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5" gracePeriod=30 Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.065052 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0fb6dfd-0694-418a-965e-789707762ef7" containerID="30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5" exitCode=0 Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.065146 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerDied","Data":"30428215fd25f2d293050de6aefc5e00ce0f54513b74c8c39065ab59e8f5dfd5"} Jan 27 11:56:22 crc kubenswrapper[4775]: I0127 11:56:22.172128 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 11:56:24 crc kubenswrapper[4775]: I0127 11:56:24.111592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"a3a1333d65d5d593f15afb4b0f08e508a777bb7a8596b72b5e166d8e425466e1"} Jan 27 11:56:59 crc kubenswrapper[4775]: I0127 11:56:59.517597 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:56:59 crc kubenswrapper[4775]: I0127 11:56:59.518423 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.495987 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.500980 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.512469 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602308 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.602364 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.704756 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc 
kubenswrapper[4775]: I0127 11:57:03.704807 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.704856 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.705430 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.705493 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.725374 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"redhat-marketplace-8xp2v\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:03 crc kubenswrapper[4775]: I0127 11:57:03.878200 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:57:04 crc kubenswrapper[4775]: I0127 11:57:04.390316 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:57:04 crc kubenswrapper[4775]: I0127 11:57:04.539956 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerStarted","Data":"0291753de9c98da2bae990b058c91775c37c615ad32e74c62d7d9edcc6e28728"} Jan 27 11:57:16 crc kubenswrapper[4775]: I0127 11:57:16.813040 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Jan 27 11:57:29 crc kubenswrapper[4775]: I0127 11:57:29.517655 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:57:29 crc kubenswrapper[4775]: I0127 11:57:29.518438 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.189537 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.191946 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.206173 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.328725 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.329106 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.329215 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431717 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431801 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.431877 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.432732 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.432786 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.462363 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"certified-operators-rw88w\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:31 crc kubenswrapper[4775]: I0127 11:57:31.516408 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:57:32 crc kubenswrapper[4775]: I0127 11:57:32.068208 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:57:32 crc kubenswrapper[4775]: I0127 11:57:32.894627 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerStarted","Data":"0aa60fb56c0c1bbe7a3921142085582cfc65147743088dfa80a0c2dcd32c2888"} Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.569986 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.573290 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.586535 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.621947 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.622059 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.622090 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.724061 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.724118 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc 
kubenswrapper[4775]: I0127 11:57:35.724272 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.725076 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.725100 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.747077 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"redhat-operators-n28vv\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:35 crc kubenswrapper[4775]: I0127 11:57:35.893903 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:57:36 crc kubenswrapper[4775]: I0127 11:57:36.205831 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:57:36 crc kubenswrapper[4775]: I0127 11:57:36.932160 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"d1e46eda680fc26f0fdf41433ba80f7398beded8cdfb1da42d96181b8897822d"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.970360 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64" exitCode=0 Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.970669 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.973845 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043" exitCode=0 Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.973939 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043"} Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.976870 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" 
containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" exitCode=0 Jan 27 11:57:39 crc kubenswrapper[4775]: I0127 11:57:39.976899 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76"} Jan 27 11:57:41 crc kubenswrapper[4775]: I0127 11:57:41.630050 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output=< Jan 27 11:57:41 crc kubenswrapper[4775]: Unkown error: Expecting value: line 1 column 1 (char 0) Jan 27 11:57:41 crc kubenswrapper[4775]: > Jan 27 11:57:41 crc kubenswrapper[4775]: I0127 11:57:41.996782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.010299 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33" exitCode=0 Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.010467 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33"} Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.017522 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" exitCode=0 Jan 27 11:57:42 crc kubenswrapper[4775]: I0127 11:57:42.017613 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232"} Jan 27 11:57:43 crc kubenswrapper[4775]: I0127 11:57:43.029491 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003" exitCode=0 Jan 27 11:57:43 crc kubenswrapper[4775]: I0127 11:57:43.029541 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} Jan 27 11:57:49 crc kubenswrapper[4775]: I0127 11:57:49.812792 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podUID="fae72616-e516-4ce6-86b8-b28f14a92939" containerName="sbdb" probeResult="failure" output="command timed out" Jan 27 11:57:49 crc kubenswrapper[4775]: I0127 11:57:49.813926 4775 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-mzqrg" podUID="fae72616-e516-4ce6-86b8-b28f14a92939" containerName="nbdb" probeResult="failure" output="command timed out" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.517481 4775 patch_prober.go:28] interesting 
pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.518118 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.518169 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.519055 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 11:57:59 crc kubenswrapper[4775]: I0127 11:57:59.519115 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" gracePeriod=600 Jan 27 11:58:00 crc kubenswrapper[4775]: E0127 11:58:00.155309 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.180799 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" exitCode=0 Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.180955 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"} Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.181148 4775 scope.go:117] "RemoveContainer" containerID="a1dfd42e295ce83974192713f1280a3eb35fc52f0c8fcb222feb124fcbeb9753" Jan 27 11:58:00 crc kubenswrapper[4775]: I0127 11:58:00.182016 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5"
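The "back-off 5m0s" in the CrashLoopBackOff errors above is the ceiling of the kubelet's restart back-off: the delay doubles after each failed restart until it is capped at five minutes, and resets once a container runs cleanly for ten minutes. A sketch of that progression, using the documented defaults rather than anything read from this node:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute // the "5m0s" quoted in the message above
	delay := 10 * time.Second        // documented initial back-off
	for restart := 1; restart <= 7; restart++ {
		fmt.Printf("after failed restart %d: wait %v\n", restart, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// Output ends at 5m0s: once saturated, every retry waits the full five minutes.
}
```

Jan 27 11:58:00 crc kubenswrapper[4775]: E0127 11:58:00.182402 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\""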
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.193679 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerStarted","Data":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.196085 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerStarted","Data":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.197996 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerStarted","Data":"267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308"} Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.219293 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rw88w" podStartSLOduration=9.953232602 podStartE2EDuration="30.219272397s" podCreationTimestamp="2026-01-27 11:57:31 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.977914957 +0000 UTC m=+2239.119512734" lastFinishedPulling="2026-01-27 11:58:00.243954752 +0000 UTC m=+2259.385552529" observedRunningTime="2026-01-27 11:58:01.216895592 +0000 UTC m=+2260.358493389" watchObservedRunningTime="2026-01-27 11:58:01.219272397 +0000 UTC m=+2260.360870174" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.238034 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n28vv" podStartSLOduration=6.0854180509999996 podStartE2EDuration="26.238015313s" podCreationTimestamp="2026-01-27 11:57:35 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.972373648 +0000 UTC m=+2239.113971425" lastFinishedPulling="2026-01-27 11:58:00.12497091 +0000 UTC m=+2259.266568687" observedRunningTime="2026-01-27 11:58:01.234850447 +0000 UTC m=+2260.376448224" watchObservedRunningTime="2026-01-27 11:58:01.238015313 +0000 UTC m=+2260.379613090" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.254292 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8xp2v" podStartSLOduration=38.008557906 podStartE2EDuration="58.254270932s" podCreationTimestamp="2026-01-27 11:57:03 +0000 UTC" firstStartedPulling="2026-01-27 11:57:39.976397057 +0000 UTC m=+2239.117994834" lastFinishedPulling="2026-01-27 11:58:00.222110083 +0000 UTC m=+2259.363707860" observedRunningTime="2026-01-27 11:58:01.254250651 +0000 UTC m=+2260.395848438" watchObservedRunningTime="2026-01-27 11:58:01.254270932 +0000 UTC m=+2260.395868709" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.516937 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:01 crc kubenswrapper[4775]: I0127 11:58:01.516988 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:02 crc kubenswrapper[4775]: I0127 11:58:02.559835 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rw88w" 
podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:02 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:02 crc kubenswrapper[4775]: > Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.879487 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.879572 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:03 crc kubenswrapper[4775]: I0127 11:58:03.932329 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:05 crc kubenswrapper[4775]: I0127 11:58:05.894231 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:05 crc kubenswrapper[4775]: I0127 11:58:05.895832 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:06 crc kubenswrapper[4775]: I0127 11:58:06.937029 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n28vv" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:06 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:06 crc kubenswrapper[4775]: > Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.563661 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.613195 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.647845 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" probeResult="failure" output=< Jan 27 11:58:11 crc kubenswrapper[4775]: Unkown error: Expecting value: line 1 column 1 (char 0) Jan 27 11:58:11 crc kubenswrapper[4775]: > Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.647931 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.649549 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-notification-agent" containerStatusID={"Type":"cri-o","ID":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-notification-agent failed liveness probe, will be restarted" Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.649624 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-notification-agent" containerID="cri-o://aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13" gracePeriod=30 Jan 27 11:58:11 crc kubenswrapper[4775]: I0127 11:58:11.806934 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:13 crc kubenswrapper[4775]: 
I0127 11:58:13.313049 4775 generic.go:334] "Generic (PLEG): container finished" podID="f0fb6dfd-0694-418a-965e-789707762ef7" containerID="aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13" exitCode=0 Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.313135 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerDied","Data":"aea3181cf116bae455f41b1366597b119efc1371f74ffae26f9a4168156cbb13"} Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.313436 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rw88w" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" containerID="cri-o://6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" gracePeriod=2 Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.795930 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.937103 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979322 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979753 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") pod \"0c6c0e22-40c5-460b-bd26-97757534ba57\" (UID: \"0c6c0e22-40c5-460b-bd26-97757534ba57\") " Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.979995 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities" (OuterVolumeSpecName: "utilities") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.981030 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:13 crc kubenswrapper[4775]: I0127 11:58:13.989819 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6" (OuterVolumeSpecName: "kube-api-access-4bjj6") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "kube-api-access-4bjj6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.027279 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c6c0e22-40c5-460b-bd26-97757534ba57" (UID: "0c6c0e22-40c5-460b-bd26-97757534ba57"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.083129 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bjj6\" (UniqueName: \"kubernetes.io/projected/0c6c0e22-40c5-460b-bd26-97757534ba57-kube-api-access-4bjj6\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.083360 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c6c0e22-40c5-460b-bd26-97757534ba57-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330427 4775 generic.go:334] "Generic (PLEG): container finished" podID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" exitCode=0 Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330480 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rw88w" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330496 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.330979 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rw88w" event={"ID":"0c6c0e22-40c5-460b-bd26-97757534ba57","Type":"ContainerDied","Data":"0aa60fb56c0c1bbe7a3921142085582cfc65147743088dfa80a0c2dcd32c2888"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.331008 4775 scope.go:117] "RemoveContainer" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.341987 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0fb6dfd-0694-418a-965e-789707762ef7","Type":"ContainerStarted","Data":"4aa92910d4fd30d50d0a609af387150a1c8121886282da00160ed4a2d0b4ef35"} Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.357066 4775 scope.go:117] "RemoveContainer" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.410212 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.411051 4775 scope.go:117] "RemoveContainer" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.420339 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rw88w"] Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.438680 4775 scope.go:117] "RemoveContainer" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: 
E0127 11:58:14.439393 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": container with ID starting with 6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9 not found: ID does not exist" containerID="6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.439441 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9"} err="failed to get container status \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": rpc error: code = NotFound desc = could not find container \"6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9\": container with ID starting with 6757287789f154c25051e4e58141c4ea2706a06d3d545ad3f8b534fb9d7881a9 not found: ID does not exist" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.439504 4775 scope.go:117] "RemoveContainer" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: E0127 11:58:14.440085 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": container with ID starting with cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232 not found: ID does not exist" containerID="cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440153 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232"} err="failed to get container status \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": rpc error: code = NotFound desc = could not find container \"cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232\": container with ID starting with cb738762ad0069c00045f2d2c994c54dc0fe6bf5c71a5dfdc29f62d1c13fe232 not found: ID does not exist" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440202 4775 scope.go:117] "RemoveContainer" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: E0127 11:58:14.440723 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": container with ID starting with a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76 not found: ID does not exist" containerID="a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76" Jan 27 11:58:14 crc kubenswrapper[4775]: I0127 11:58:14.440750 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76"} err="failed to get container status \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": rpc error: code = NotFound desc = could not find container \"a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76\": container with ID starting with a12fb35161064a1fb6c7ba1ba387db70b17ea95a59223fed3e8af918ec283d76 not found: ID does not exist" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.745528 
4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:15 crc kubenswrapper[4775]: E0127 11:58:15.745785 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.756526 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" path="/var/lib/kubelet/pods/0c6c0e22-40c5-460b-bd26-97757534ba57/volumes" Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.812779 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:58:15 crc kubenswrapper[4775]: I0127 11:58:15.813055 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8xp2v" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server" containerID="cri-o://267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308" gracePeriod=2 Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.362872 4775 generic.go:334] "Generic (PLEG): container finished" podID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerID="267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308" exitCode=0 Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.362908 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308"} Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.904287 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:16 crc kubenswrapper[4775]: I0127 11:58:16.950526 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n28vv" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" probeResult="failure" output=< Jan 27 11:58:16 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 11:58:16 crc kubenswrapper[4775]: > Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040756 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040878 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.040978 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") pod \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\" (UID: \"1f75ce6e-5d6d-4f5c-8ac9-803632a916da\") " Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.041749 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities" (OuterVolumeSpecName: "utilities") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.047089 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp" (OuterVolumeSpecName: "kube-api-access-zgtkp") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "kube-api-access-zgtkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.067136 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f75ce6e-5d6d-4f5c-8ac9-803632a916da" (UID: "1f75ce6e-5d6d-4f5c-8ac9-803632a916da"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143117 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgtkp\" (UniqueName: \"kubernetes.io/projected/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-kube-api-access-zgtkp\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143158 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.143171 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f75ce6e-5d6d-4f5c-8ac9-803632a916da-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.378779 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8xp2v" event={"ID":"1f75ce6e-5d6d-4f5c-8ac9-803632a916da","Type":"ContainerDied","Data":"0291753de9c98da2bae990b058c91775c37c615ad32e74c62d7d9edcc6e28728"} Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.378864 4775 scope.go:117] "RemoveContainer" containerID="267f2badda022e36c2d1823938582d163ff75cd57ead4dfe95643ae7950e1308" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.379068 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8xp2v" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.419935 4775 scope.go:117] "RemoveContainer" containerID="3f41f77137616cfc9a29698c06344fd248206ddb45a7e5ae68f078f59add4d33" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.422443 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.431231 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8xp2v"] Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.440317 4775 scope.go:117] "RemoveContainer" containerID="63ad6895a072ae0f6274d71feae46f6b9e7b51d511d23402a809bd3efe084043" Jan 27 11:58:17 crc kubenswrapper[4775]: I0127 11:58:17.755282 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" path="/var/lib/kubelet/pods/1f75ce6e-5d6d-4f5c-8ac9-803632a916da/volumes" Jan 27 11:58:25 crc kubenswrapper[4775]: I0127 11:58:25.953497 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:26 crc kubenswrapper[4775]: I0127 11:58:26.024954 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:26 crc kubenswrapper[4775]: I0127 11:58:26.085425 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:58:27 crc kubenswrapper[4775]: I0127 11:58:27.471516 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n28vv" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" containerID="cri-o://d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" gracePeriod=2 Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.451651 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495193 4775 generic.go:334] "Generic (PLEG): container finished" podID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" exitCode=0 Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495239 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"} Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495303 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n28vv" event={"ID":"3fbf29b1-8a03-401a-99b3-7e5e6334036b","Type":"ContainerDied","Data":"d1e46eda680fc26f0fdf41433ba80f7398beded8cdfb1da42d96181b8897822d"} Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495325 4775 scope.go:117] "RemoveContainer" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.495494 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n28vv" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.518935 4775 scope.go:117] "RemoveContainer" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.551695 4775 scope.go:117] "RemoveContainer" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.566621 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.567328 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.567540 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") pod \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\" (UID: \"3fbf29b1-8a03-401a-99b3-7e5e6334036b\") " Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.570134 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities" (OuterVolumeSpecName: "utilities") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.580837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt" (OuterVolumeSpecName: "kube-api-access-pm2qt") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "kube-api-access-pm2qt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.624824 4775 scope.go:117] "RemoveContainer" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.628661 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": container with ID starting with d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2 not found: ID does not exist" containerID="d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.628747 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2"} err="failed to get container status \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": rpc error: code = NotFound desc = could not find container \"d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2\": container with ID starting with d375671f67640bfb7c0f7fd791bf72f58eeb3d082e4405101f1fa301539e08c2 not found: ID does not exist" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.628794 4775 scope.go:117] "RemoveContainer" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003" Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.629292 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": container with ID starting with 101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003 not found: ID does not exist" containerID="101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629326 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003"} err="failed to get container status \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": rpc error: code = NotFound desc = could not find container \"101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003\": container with ID starting with 101664d54a6cc0d68ddcc0d9e7e6085d23dc6983b4e64297513dcd3c53935003 not found: ID does not exist" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629347 4775 scope.go:117] "RemoveContainer" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64" Jan 27 11:58:28 crc kubenswrapper[4775]: E0127 11:58:28.629753 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": container with ID starting with 18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64 not found: ID does not 
exist" containerID="18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.629805 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64"} err="failed to get container status \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": rpc error: code = NotFound desc = could not find container \"18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64\": container with ID starting with 18141fdda29da5e0bf4a08bd02c1275405e245bfa2ae0f9b966af81fb8948a64 not found: ID does not exist" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.671230 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.671277 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2qt\" (UniqueName: \"kubernetes.io/projected/3fbf29b1-8a03-401a-99b3-7e5e6334036b-kube-api-access-pm2qt\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.712951 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3fbf29b1-8a03-401a-99b3-7e5e6334036b" (UID: "3fbf29b1-8a03-401a-99b3-7e5e6334036b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.773071 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3fbf29b1-8a03-401a-99b3-7e5e6334036b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.841856 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:58:28 crc kubenswrapper[4775]: I0127 11:58:28.852060 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n28vv"] Jan 27 11:58:29 crc kubenswrapper[4775]: I0127 11:58:29.746135 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:29 crc kubenswrapper[4775]: E0127 11:58:29.746508 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:29 crc kubenswrapper[4775]: I0127 11:58:29.756533 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" path="/var/lib/kubelet/pods/3fbf29b1-8a03-401a-99b3-7e5e6334036b/volumes" Jan 27 11:58:40 crc kubenswrapper[4775]: I0127 11:58:40.744976 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:40 crc kubenswrapper[4775]: E0127 11:58:40.746410 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:58:51 crc kubenswrapper[4775]: I0127 11:58:51.750938 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:58:51 crc kubenswrapper[4775]: E0127 11:58:51.751817 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:59:02 crc kubenswrapper[4775]: I0127 11:59:02.745011 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:59:02 crc kubenswrapper[4775]: E0127 11:59:02.745974 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:59:14 crc kubenswrapper[4775]: I0127 11:59:14.744827 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:59:14 crc kubenswrapper[4775]: E0127 11:59:14.745728 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:59:28 crc kubenswrapper[4775]: I0127 11:59:28.744781 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:59:28 crc kubenswrapper[4775]: E0127 11:59:28.745570 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:59:41 crc kubenswrapper[4775]: I0127 11:59:41.760826 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:59:41 crc kubenswrapper[4775]: E0127 11:59:41.761791 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 11:59:53 crc kubenswrapper[4775]: I0127 11:59:53.747660 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 11:59:53 crc kubenswrapper[4775]: E0127 11:59:53.748430 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.156247 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"] Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157428 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157443 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157479 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157488 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157513 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157521 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157535 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157542 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157556 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157563 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-utilities" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157600 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157608 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157629 4775 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157637 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157658 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157666 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="extract-content" Jan 27 12:00:00 crc kubenswrapper[4775]: E0127 12:00:00.157678 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157695 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157908 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f75ce6e-5d6d-4f5c-8ac9-803632a916da" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157928 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fbf29b1-8a03-401a-99b3-7e5e6334036b" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.157950 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c6c0e22-40c5-460b-bd26-97757534ba57" containerName="registry-server" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.158829 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.162024 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.165205 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.183331 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"] Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341412 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341516 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.341656 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443382 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443499 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.443534 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.445418 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod 
\"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.451016 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.459770 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"collect-profiles-29491920-kk5ts\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.489801 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:00 crc kubenswrapper[4775]: I0127 12:00:00.918452 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts"] Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.315562 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerStarted","Data":"f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5"} Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.316648 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerStarted","Data":"a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88"} Jan 27 12:00:01 crc kubenswrapper[4775]: I0127 12:00:01.335676 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" podStartSLOduration=1.335660659 podStartE2EDuration="1.335660659s" podCreationTimestamp="2026-01-27 12:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 12:00:01.334054504 +0000 UTC m=+2380.475652281" watchObservedRunningTime="2026-01-27 12:00:01.335660659 +0000 UTC m=+2380.477258436" Jan 27 12:00:02 crc kubenswrapper[4775]: I0127 12:00:02.325502 4775 generic.go:334] "Generic (PLEG): container finished" podID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerID="f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5" exitCode=0 Jan 27 12:00:02 crc kubenswrapper[4775]: I0127 12:00:02.325606 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerDied","Data":"f8ee5ec2a726ff9b461ea7b1e669ac7406fde32c15888e6e019d6dcdec7fc3a5"} Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.608072 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.699750 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.700194 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.701063 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") pod \"5c054560-ca6e-4a4f-8116-df9beff95ec2\" (UID: \"5c054560-ca6e-4a4f-8116-df9beff95ec2\") " Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.701598 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume" (OuterVolumeSpecName: "config-volume") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.702337 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5c054560-ca6e-4a4f-8116-df9beff95ec2-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.706553 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt" (OuterVolumeSpecName: "kube-api-access-bgnnt") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "kube-api-access-bgnnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.706653 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5c054560-ca6e-4a4f-8116-df9beff95ec2" (UID: "5c054560-ca6e-4a4f-8116-df9beff95ec2"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.804310 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5c054560-ca6e-4a4f-8116-df9beff95ec2-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:03 crc kubenswrapper[4775]: I0127 12:00:03.804350 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgnnt\" (UniqueName: \"kubernetes.io/projected/5c054560-ca6e-4a4f-8116-df9beff95ec2-kube-api-access-bgnnt\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.348900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" event={"ID":"5c054560-ca6e-4a4f-8116-df9beff95ec2","Type":"ContainerDied","Data":"a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88"} Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.348968 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491920-kk5ts" Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.349501 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72d4ae538fb5c49345cefca37ff75cc77a0f4e50afcf8e9121cb652ee642d88" Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.416076 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 12:00:04 crc kubenswrapper[4775]: I0127 12:00:04.425618 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491875-pj2rv"] Jan 27 12:00:05 crc kubenswrapper[4775]: I0127 12:00:05.745882 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:05 crc kubenswrapper[4775]: E0127 12:00:05.746415 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:00:05 crc kubenswrapper[4775]: I0127 12:00:05.764071 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04906ea0-5e8b-4e8b-8f20-c46587da8346" path="/var/lib/kubelet/pods/04906ea0-5e8b-4e8b-8f20-c46587da8346/volumes" Jan 27 12:00:16 crc kubenswrapper[4775]: I0127 12:00:16.745139 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:16 crc kubenswrapper[4775]: E0127 12:00:16.745920 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:00:27 crc kubenswrapper[4775]: I0127 12:00:27.744807 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:27 
crc kubenswrapper[4775]: E0127 12:00:27.745651 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:00:40 crc kubenswrapper[4775]: I0127 12:00:40.745966 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:40 crc kubenswrapper[4775]: E0127 12:00:40.747081 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:00:41 crc kubenswrapper[4775]: I0127 12:00:41.669475 4775 generic.go:334] "Generic (PLEG): container finished" podID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerID="9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d" exitCode=0 Jan 27 12:00:41 crc kubenswrapper[4775]: I0127 12:00:41.669584 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerDied","Data":"9c18ce206d2ab737b472fdfd73559615373b44cbbe3b8f6f7afb1058b247290d"} Jan 27 12:00:42 crc kubenswrapper[4775]: I0127 12:00:42.166951 4775 scope.go:117] "RemoveContainer" containerID="bee43132c84a9e322e462c0d4b4b214665e4a0e6c90cb849008c237820eb6817" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.150122 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341842 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341904 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.341934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.342203 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.342235 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") pod \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\" (UID: \"7ab3ce35-77fe-4e38-ad60-c5906f6d061a\") " Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.354735 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.354837 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv" (OuterVolumeSpecName: "kube-api-access-jthbv") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "kube-api-access-jthbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.379879 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory" (OuterVolumeSpecName: "inventory") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.379962 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.380314 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "7ab3ce35-77fe-4e38-ad60-c5906f6d061a" (UID: "7ab3ce35-77fe-4e38-ad60-c5906f6d061a"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444392 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444438 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444465 4775 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444474 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jthbv\" (UniqueName: \"kubernetes.io/projected/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-kube-api-access-jthbv\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.444484 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7ab3ce35-77fe-4e38-ad60-c5906f6d061a-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691544 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" event={"ID":"7ab3ce35-77fe-4e38-ad60-c5906f6d061a","Type":"ContainerDied","Data":"1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb"} Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691596 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a8cf55eb3beaba1896ff94ae83a14e9b56c32210fb2f25fc6ea5ce6da28dbfb" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.691620 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.808924 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:43 crc kubenswrapper[4775]: E0127 12:00:43.809516 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809540 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: E0127 12:00:43.809564 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809573 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809823 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c054560-ca6e-4a4f-8116-df9beff95ec2" containerName="collect-profiles" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.809846 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ab3ce35-77fe-4e38-ad60-c5906f6d061a" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.810628 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813437 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813460 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.813665 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814283 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814479 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.814688 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.816727 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.822004 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954212 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: 
\"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954291 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954341 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954368 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954437 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954485 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954511 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954574 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:43 crc kubenswrapper[4775]: I0127 12:00:43.954597 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.055960 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056033 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056080 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056108 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.056177 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057301 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057482 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057532 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057613 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.057637 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.062371 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.063138 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.063756 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.064074 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.064246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.065197 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.066050 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.072608 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"nova-edpm-deployment-openstack-edpm-ipam-27lb2\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.139070 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:00:44 crc kubenswrapper[4775]: I0127 12:00:44.763298 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2"] Jan 27 12:00:45 crc kubenswrapper[4775]: I0127 12:00:45.715084 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerStarted","Data":"c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1"} Jan 27 12:00:46 crc kubenswrapper[4775]: I0127 12:00:46.737917 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerStarted","Data":"0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca"} Jan 27 12:00:46 crc kubenswrapper[4775]: I0127 12:00:46.798259 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" podStartSLOduration=2.823315264 podStartE2EDuration="3.798222252s" podCreationTimestamp="2026-01-27 12:00:43 +0000 UTC" firstStartedPulling="2026-01-27 12:00:44.766127084 +0000 UTC m=+2423.907724861" lastFinishedPulling="2026-01-27 12:00:45.741034072 +0000 UTC m=+2424.882631849" observedRunningTime="2026-01-27 12:00:46.779326306 +0000 UTC m=+2425.920924103" watchObservedRunningTime="2026-01-27 12:00:46.798222252 +0000 UTC m=+2425.939820029" Jan 27 12:00:55 crc kubenswrapper[4775]: I0127 12:00:55.745586 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:00:55 crc kubenswrapper[4775]: E0127 12:00:55.746820 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.142463 4775 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.143770 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.161246 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217289 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217387 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217472 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.217532 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319606 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319670 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319715 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.319754 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: 
\"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.326696 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.336538 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.337212 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.339389 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"keystone-cron-29491921-2bnsm\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.474500 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:00 crc kubenswrapper[4775]: I0127 12:01:00.902303 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29491921-2bnsm"] Jan 27 12:01:00 crc kubenswrapper[4775]: W0127 12:01:00.908994 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ce874bb_50b0_4a56_a322_f5590c1d19bd.slice/crio-865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98 WatchSource:0}: Error finding container 865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98: Status 404 returned error can't find the container with id 865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98 Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.902803 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerStarted","Data":"9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985"} Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.903464 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerStarted","Data":"865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98"} Jan 27 12:01:01 crc kubenswrapper[4775]: I0127 12:01:01.923471 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29491921-2bnsm" podStartSLOduration=1.9234339280000001 podStartE2EDuration="1.923433928s" podCreationTimestamp="2026-01-27 12:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-27 12:01:01.921799695 +0000 UTC m=+2441.063397502" watchObservedRunningTime="2026-01-27 12:01:01.923433928 +0000 UTC m=+2441.065031705" Jan 27 12:01:03 crc kubenswrapper[4775]: I0127 12:01:03.925445 4775 generic.go:334] "Generic (PLEG): container finished" podID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerID="9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985" exitCode=0 Jan 27 12:01:03 crc kubenswrapper[4775]: I0127 12:01:03.925516 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerDied","Data":"9f0c98731d688a69835a1ca60f98f70ee8474a138e88b3f3711d909f34cc3985"} Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.263007 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316089 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316195 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316247 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.316341 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") pod \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\" (UID: \"5ce874bb-50b0-4a56-a322-f5590c1d19bd\") " Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.322849 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.333443 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m" (OuterVolumeSpecName: "kube-api-access-ppd4m") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "kube-api-access-ppd4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.347234 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.383848 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data" (OuterVolumeSpecName: "config-data") pod "5ce874bb-50b0-4a56-a322-f5590c1d19bd" (UID: "5ce874bb-50b0-4a56-a322-f5590c1d19bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418223 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418256 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppd4m\" (UniqueName: \"kubernetes.io/projected/5ce874bb-50b0-4a56-a322-f5590c1d19bd-kube-api-access-ppd4m\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418269 4775 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.418277 4775 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5ce874bb-50b0-4a56-a322-f5590c1d19bd-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943782 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29491921-2bnsm" event={"ID":"5ce874bb-50b0-4a56-a322-f5590c1d19bd","Type":"ContainerDied","Data":"865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98"} Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943847 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="865df426ea7712199666a8f9d8149f7edb5f7fce0de03d0e0d34bbada43b2d98" Jan 27 12:01:05 crc kubenswrapper[4775]: I0127 12:01:05.943867 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29491921-2bnsm" Jan 27 12:01:06 crc kubenswrapper[4775]: I0127 12:01:06.745434 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:06 crc kubenswrapper[4775]: E0127 12:01:06.745738 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:19 crc kubenswrapper[4775]: I0127 12:01:19.745821 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:19 crc kubenswrapper[4775]: E0127 12:01:19.746832 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:33 crc kubenswrapper[4775]: I0127 12:01:33.745816 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:33 crc kubenswrapper[4775]: E0127 12:01:33.746969 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:01:46 crc kubenswrapper[4775]: I0127 12:01:46.744598 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:01:46 crc kubenswrapper[4775]: E0127 12:01:46.746387 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:00 crc kubenswrapper[4775]: I0127 12:02:00.744951 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:00 crc kubenswrapper[4775]: E0127 12:02:00.745852 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:11 crc kubenswrapper[4775]: I0127 12:02:11.752330 4775 scope.go:117] "RemoveContainer" 
containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:11 crc kubenswrapper[4775]: E0127 12:02:11.753369 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:25 crc kubenswrapper[4775]: I0127 12:02:25.745519 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:25 crc kubenswrapper[4775]: E0127 12:02:25.746780 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:38 crc kubenswrapper[4775]: I0127 12:02:38.745710 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:38 crc kubenswrapper[4775]: E0127 12:02:38.746419 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:02:50 crc kubenswrapper[4775]: I0127 12:02:50.745421 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:02:50 crc kubenswrapper[4775]: E0127 12:02:50.746296 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:03:01 crc kubenswrapper[4775]: I0127 12:03:01.754535 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:03:02 crc kubenswrapper[4775]: I0127 12:03:02.938973 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"} Jan 27 12:03:04 crc kubenswrapper[4775]: I0127 12:03:04.962599 4775 generic.go:334] "Generic (PLEG): container finished" podID="36bee79d-4a97-407b-9907-87d740929ba0" containerID="0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca" exitCode=0 Jan 27 12:03:04 crc kubenswrapper[4775]: I0127 12:03:04.962765 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" 
event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerDied","Data":"0d10c4e2f56c82fb2bd1a827f287080bf4533c4f837f84b4f88c1945f15b20ca"} Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.371798 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.498988 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499094 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499151 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499250 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499320 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499348 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499418 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499466 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.499551 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz55c\" (UniqueName: 
\"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") pod \"36bee79d-4a97-407b-9907-87d740929ba0\" (UID: \"36bee79d-4a97-407b-9907-87d740929ba0\") " Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.507384 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c" (OuterVolumeSpecName: "kube-api-access-cz55c") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "kube-api-access-cz55c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.507574 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.533706 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.533733 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.534009 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.537400 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory" (OuterVolumeSpecName: "inventory") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.539923 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.544739 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.552586 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "36bee79d-4a97-407b-9907-87d740929ba0" (UID: "36bee79d-4a97-407b-9907-87d740929ba0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603583 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603629 4775 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603651 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603663 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603707 4775 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603720 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cz55c\" (UniqueName: \"kubernetes.io/projected/36bee79d-4a97-407b-9907-87d740929ba0-kube-api-access-cz55c\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603735 4775 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/36bee79d-4a97-407b-9907-87d740929ba0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603751 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.603762 4775 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bee79d-4a97-407b-9907-87d740929ba0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:06 crc 
kubenswrapper[4775]: I0127 12:03:06.981127 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" event={"ID":"36bee79d-4a97-407b-9907-87d740929ba0","Type":"ContainerDied","Data":"c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1"} Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.981168 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0f0e60fd0308f16eb0ba529574f44937b12975ddccb5ca3ee21176c89d848a1" Jan 27 12:03:06 crc kubenswrapper[4775]: I0127 12:03:06.981239 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-27lb2" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.096912 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"] Jan 27 12:03:07 crc kubenswrapper[4775]: E0127 12:03:07.097335 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097361 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: E0127 12:03:07.097404 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097414 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097768 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bee79d-4a97-407b-9907-87d740929ba0" containerName="nova-edpm-deployment-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.097806 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ce874bb-50b0-4a56-a322-f5590c1d19bd" containerName="keystone-cron" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.098538 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102069 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102343 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102655 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.102886 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.103100 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-lxz4z" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.108875 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"] Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133334 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133385 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133414 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133434 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133488 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc 
kubenswrapper[4775]: I0127 12:03:07.133522 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.133573 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235333 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235403 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235459 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235492 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235517 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235546 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.235575 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.240775 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.240985 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.241000 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.241941 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.242370 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.242790 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.254047 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8qrs\" (UniqueName: 
\"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-trmfd\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:07 crc kubenswrapper[4775]: I0127 12:03:07.438389 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:03:08 crc kubenswrapper[4775]: I0127 12:03:08.026038 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd"] Jan 27 12:03:08 crc kubenswrapper[4775]: I0127 12:03:08.033417 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.002275 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerStarted","Data":"8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde"} Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.002762 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerStarted","Data":"e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6"} Jan 27 12:03:09 crc kubenswrapper[4775]: I0127 12:03:09.025980 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" podStartSLOduration=1.549605085 podStartE2EDuration="2.025961373s" podCreationTimestamp="2026-01-27 12:03:07 +0000 UTC" firstStartedPulling="2026-01-27 12:03:08.033209927 +0000 UTC m=+2567.174807704" lastFinishedPulling="2026-01-27 12:03:08.509566195 +0000 UTC m=+2567.651163992" observedRunningTime="2026-01-27 12:03:09.020525797 +0000 UTC m=+2568.162123584" watchObservedRunningTime="2026-01-27 12:03:09.025961373 +0000 UTC m=+2568.167559150" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.676216 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.695041 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.707894 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840219 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840353 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.840389 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942327 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942383 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.942473 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.943154 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.943364 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:23 crc kubenswrapper[4775]: I0127 12:03:23.969223 4775 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"community-operators-lzgfp\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:24 crc kubenswrapper[4775]: I0127 12:03:24.025289 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:25 crc kubenswrapper[4775]: W0127 12:03:25.643761 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452087fc_eb7e_4fb0_9c5b_cc827c8fc32e.slice/crio-d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f WatchSource:0}: Error finding container d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f: Status 404 returned error can't find the container with id d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f Jan 27 12:03:25 crc kubenswrapper[4775]: I0127 12:03:25.645688 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159324 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032" exitCode=0 Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159387 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"} Jan 27 12:03:26 crc kubenswrapper[4775]: I0127 12:03:26.159730 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f"} Jan 27 12:03:27 crc kubenswrapper[4775]: I0127 12:03:27.187727 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"} Jan 27 12:03:28 crc kubenswrapper[4775]: I0127 12:03:28.200126 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9" exitCode=0 Jan 27 12:03:28 crc kubenswrapper[4775]: I0127 12:03:28.200179 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"} Jan 27 12:03:29 crc kubenswrapper[4775]: I0127 12:03:29.213611 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerStarted","Data":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"} Jan 27 12:03:29 crc kubenswrapper[4775]: I0127 12:03:29.245757 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lzgfp" 
podStartSLOduration=3.825975781 podStartE2EDuration="6.245737595s" podCreationTimestamp="2026-01-27 12:03:23 +0000 UTC" firstStartedPulling="2026-01-27 12:03:26.160988929 +0000 UTC m=+2585.302586706" lastFinishedPulling="2026-01-27 12:03:28.580750743 +0000 UTC m=+2587.722348520" observedRunningTime="2026-01-27 12:03:29.235829519 +0000 UTC m=+2588.377427296" watchObservedRunningTime="2026-01-27 12:03:29.245737595 +0000 UTC m=+2588.387335372" Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.025733 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.026247 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.069053 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.304794 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:34 crc kubenswrapper[4775]: I0127 12:03:34.355313 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.264493 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lzgfp" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" containerID="cri-o://dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" gracePeriod=2 Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.725314 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.893936 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894254 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894394 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") pod \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\" (UID: \"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e\") " Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.894957 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities" (OuterVolumeSpecName: "utilities") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.896389 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.900546 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc" (OuterVolumeSpecName: "kube-api-access-wmdcc") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "kube-api-access-wmdcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.954131 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" (UID: "452087fc-eb7e-4fb0-9c5b-cc827c8fc32e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.998404 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdcc\" (UniqueName: \"kubernetes.io/projected/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-kube-api-access-wmdcc\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:36 crc kubenswrapper[4775]: I0127 12:03:36.998622 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.280952 4775 generic.go:334] "Generic (PLEG): container finished" podID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" exitCode=0 Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281061 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"} Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281153 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lzgfp" event={"ID":"452087fc-eb7e-4fb0-9c5b-cc827c8fc32e","Type":"ContainerDied","Data":"d1db8bdfd16475be48f59ac6b91a7f8b7ce421e2632029524e8f3c6b7399db3f"} Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281179 4775 scope.go:117] "RemoveContainer" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.281084 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lzgfp" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.306726 4775 scope.go:117] "RemoveContainer" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.332226 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.343320 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lzgfp"] Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.356476 4775 scope.go:117] "RemoveContainer" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.396743 4775 scope.go:117] "RemoveContainer" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.397354 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": container with ID starting with dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd not found: ID does not exist" containerID="dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397401 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd"} err="failed to get container status \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": rpc error: code = NotFound desc = could not find container \"dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd\": container with ID starting with dbf2f5f07d30e4bb3e93e9b08fd14cba8011f23e768f73a5635eee25082dafcd not found: ID does not exist" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397431 4775 scope.go:117] "RemoveContainer" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9" Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.397793 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": container with ID starting with 32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9 not found: ID does not exist" containerID="32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397837 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9"} err="failed to get container status \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": rpc error: code = NotFound desc = could not find container \"32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9\": container with ID starting with 32e50864ea967911281862b30e3c02ac12f6ec6dbd0af03f816ce7098fcd30e9 not found: ID does not exist" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.397864 4775 scope.go:117] "RemoveContainer" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032" Jan 27 12:03:37 crc kubenswrapper[4775]: E0127 12:03:37.398241 4775 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": container with ID starting with 38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032 not found: ID does not exist" containerID="38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.398308 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032"} err="failed to get container status \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": rpc error: code = NotFound desc = could not find container \"38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032\": container with ID starting with 38dae90ff63fd2ed56f00cf82acb27e0db1bae8db9a27decd284c1aedf54e032 not found: ID does not exist" Jan 27 12:03:37 crc kubenswrapper[4775]: I0127 12:03:37.755527 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" path="/var/lib/kubelet/pods/452087fc-eb7e-4fb0-9c5b-cc827c8fc32e/volumes" Jan 27 12:05:29 crc kubenswrapper[4775]: I0127 12:05:29.518053 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:05:29 crc kubenswrapper[4775]: I0127 12:05:29.518725 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:05:47 crc kubenswrapper[4775]: I0127 12:05:47.870921 4775 generic.go:334] "Generic (PLEG): container finished" podID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerID="8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde" exitCode=0 Jan 27 12:05:47 crc kubenswrapper[4775]: I0127 12:05:47.871009 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerDied","Data":"8073976c70e0ee717b2782c46f8e7a8d92d5ccef8cc4787bbe1d623fac532fde"} Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.264686 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.385467 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.385590 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386034 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386099 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386177 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386207 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.386279 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") pod \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\" (UID: \"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398\") " Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.395724 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs" (OuterVolumeSpecName: "kube-api-access-d8qrs") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "kube-api-access-d8qrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.397092 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). 
InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.420311 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.422278 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.423301 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.424622 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.431652 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory" (OuterVolumeSpecName: "inventory") pod "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" (UID: "c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489392 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8qrs\" (UniqueName: \"kubernetes.io/projected/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-kube-api-access-d8qrs\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489430 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489440 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489525 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489535 4775 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489543 4775 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.489575 4775 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398-inventory\") on node \"crc\" DevicePath \"\"" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.891965 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" event={"ID":"c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398","Type":"ContainerDied","Data":"e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6"} Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.892031 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-trmfd" Jan 27 12:05:49 crc kubenswrapper[4775]: I0127 12:05:49.892046 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6" Jan 27 12:05:49 crc kubenswrapper[4775]: E0127 12:05:49.924979 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5bab8d8_2ee4_4499_aa5a_9fe4f21ad398.slice/crio-e2c9957e6919cf21a73a6a159a6ac6b856a70c65eadfdcf7e4c5a6fc0fb126a6\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5bab8d8_2ee4_4499_aa5a_9fe4f21ad398.slice\": RecentStats: unable to find data in memory cache]" Jan 27 12:05:59 crc kubenswrapper[4775]: I0127 12:05:59.518126 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:05:59 crc kubenswrapper[4775]: I0127 12:05:59.518769 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.517526 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.518117 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.518178 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.519062 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 12:06:29 crc kubenswrapper[4775]: I0127 12:06:29.519129 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365" gracePeriod=600 Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.292652 4775 generic.go:334] "Generic (PLEG): container finished" 
podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365" exitCode=0 Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.292738 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365"} Jan 27 12:06:30 crc kubenswrapper[4775]: I0127 12:06:30.293375 4775 scope.go:117] "RemoveContainer" containerID="43175eaf3d4cb4f458a57681359d4462c6739096a0c9252da92846dde506cef5" Jan 27 12:06:36 crc kubenswrapper[4775]: I0127 12:06:36.814104 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="f0fb6dfd-0694-418a-965e-789707762ef7" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Jan 27 12:06:37 crc kubenswrapper[4775]: I0127 12:06:37.390164 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"} Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.804574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805525 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-utilities" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805540 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-utilities" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805565 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805571 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805584 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-content" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805590 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="extract-content" Jan 27 12:07:01 crc kubenswrapper[4775]: E0127 12:07:01.805605 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805612 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805824 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.805846 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="452087fc-eb7e-4fb0-9c5b-cc827c8fc32e" containerName="registry-server" Jan 27 12:07:01 crc kubenswrapper[4775]: 
I0127 12:07:01.806626 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.812300 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.813222 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.815063 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m62zz" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.815252 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.819061 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.873282 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.874071 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.874296 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976101 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976162 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976206 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976272 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976303 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976400 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976460 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.976566 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.977213 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.977540 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:01 crc kubenswrapper[4775]: I0127 12:07:01.983175 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078140 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078234 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078323 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078343 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078372 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078408 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078650 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.078938 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.080279 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.083066 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.084112 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.096300 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.107312 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.137972 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:02 crc kubenswrapper[4775]: I0127 12:07:02.676599 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Jan 27 12:07:03 crc kubenswrapper[4775]: I0127 12:07:03.612197 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerStarted","Data":"00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da"} Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.169715 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.170535 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-spdp7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.171788 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" Jan 27 12:07:34 crc kubenswrapper[4775]: E0127 12:07:34.934549 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" Jan 27 12:07:48 crc kubenswrapper[4775]: I0127 12:07:48.550875 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Jan 27 12:07:50 crc kubenswrapper[4775]: I0127 12:07:50.072280 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerStarted","Data":"40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748"} Jan 27 12:07:50 crc kubenswrapper[4775]: I0127 12:07:50.095090 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.226045863 podStartE2EDuration="50.095064914s" podCreationTimestamp="2026-01-27 12:07:00 +0000 UTC" firstStartedPulling="2026-01-27 12:07:02.679610082 +0000 UTC m=+2801.821207859" lastFinishedPulling="2026-01-27 12:07:48.548629133 +0000 UTC m=+2847.690226910" observedRunningTime="2026-01-27 12:07:50.093584645 +0000 UTC m=+2849.235182432" watchObservedRunningTime="2026-01-27 12:07:50.095064914 +0000 UTC m=+2849.236662691" Jan 27 12:07:58 crc kubenswrapper[4775]: I0127 12:07:58.144722 4775 generic.go:334] "Generic (PLEG): container finished" podID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerID="40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748" exitCode=123 Jan 27 12:07:58 crc kubenswrapper[4775]: I0127 12:07:58.144842 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerDied","Data":"40fc909f0ec8c053ea1716eb162721fd6c0dfff06bce588ecd82e1bf26830748"} Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.429604 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-66648b46df-hskmp" podUID="e22ddb6f-e33b-41ea-a24f-c97c0676e6e5" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.528749 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668314 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668614 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668650 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668706 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668763 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668828 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668854 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668899 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.668934 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") pod \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\" (UID: \"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4\") " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.669742 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary" (OuterVolumeSpecName: 
"test-operator-ephemeral-temporary") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.670349 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data" (OuterVolumeSpecName: "config-data") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.670477 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.676096 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.676809 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7" (OuterVolumeSpecName: "kube-api-access-spdp7") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "kube-api-access-spdp7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.699818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.700087 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.701324 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.724783 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" (UID: "ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771500 4775 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771536 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771551 4775 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771565 4775 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ca-certs\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771575 4775 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-config-data\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771587 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771598 4775 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-ssh-key\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771609 4775 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.771621 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdp7\" (UniqueName: \"kubernetes.io/projected/ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4-kube-api-access-spdp7\") on node \"crc\" DevicePath \"\"" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.796007 4775 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 27 12:07:59 crc kubenswrapper[4775]: I0127 12:07:59.873864 4775 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171220 4775 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/tempest-tests-tempest" event={"ID":"ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4","Type":"ContainerDied","Data":"00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da"} Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171651 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00ec443440cadc0301613dc3ce658f1a7eee2be5cf8cf133ac8dca49daa5e2da" Jan 27 12:08:00 crc kubenswrapper[4775]: I0127 12:08:00.171273 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.458077 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:08 crc kubenswrapper[4775]: E0127 12:08:08.459205 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.459227 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.459519 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4" containerName="tempest-tests-tempest-tests-runner" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.460250 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.463250 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-m62zz" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.469255 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.554631 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.554761 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.657084 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.657860 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.658366 4775 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.682662 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb99p\" (UniqueName: \"kubernetes.io/projected/9d17d9d1-39f7-417c-b058-cda582c7f7d3-kube-api-access-tb99p\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.686992 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9d17d9d1-39f7-417c-b058-cda582c7f7d3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:08 crc kubenswrapper[4775]: I0127 12:08:08.792194 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Jan 27 12:08:09 crc kubenswrapper[4775]: I0127 12:08:09.266946 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Jan 27 12:08:09 crc kubenswrapper[4775]: W0127 12:08:09.268926 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d17d9d1_39f7_417c_b058_cda582c7f7d3.slice/crio-c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8 WatchSource:0}: Error finding container c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8: Status 404 returned error can't find the container with id c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8 Jan 27 12:08:09 crc kubenswrapper[4775]: I0127 12:08:09.272013 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 12:08:10 crc kubenswrapper[4775]: I0127 12:08:10.260846 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9d17d9d1-39f7-417c-b058-cda582c7f7d3","Type":"ContainerStarted","Data":"c41b5854d1aa4d5b8c41bd8210a19bc84d1026b1edd74d1414b01d9996fb08b8"} Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.271590 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9d17d9d1-39f7-417c-b058-cda582c7f7d3","Type":"ContainerStarted","Data":"ea19d02276e7877ad22f2cd609aa15158b05d96b97422667d1ded2ab38bb1384"} Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.288437 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.16042419 
podStartE2EDuration="3.288412766s" podCreationTimestamp="2026-01-27 12:08:08 +0000 UTC" firstStartedPulling="2026-01-27 12:08:09.271731428 +0000 UTC m=+2868.413329205" lastFinishedPulling="2026-01-27 12:08:10.399720004 +0000 UTC m=+2869.541317781" observedRunningTime="2026-01-27 12:08:11.285900559 +0000 UTC m=+2870.427498336" watchObservedRunningTime="2026-01-27 12:08:11.288412766 +0000 UTC m=+2870.430010543" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.641782 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.644122 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.658288 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723193 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.723649 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827325 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827476 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.827599 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.828157 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.828266 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.857614 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"certified-operators-g96dh\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:11 crc kubenswrapper[4775]: I0127 12:08:11.962936 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:12 crc kubenswrapper[4775]: I0127 12:08:12.515613 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292409 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b" exitCode=0 Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292499 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b"} Jan 27 12:08:13 crc kubenswrapper[4775]: I0127 12:08:13.292848 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"49113f27bda5f8b5de5626a113acf85f302b42d3df20c2e33e6424f2d1c022cf"} Jan 27 12:08:15 crc kubenswrapper[4775]: I0127 12:08:15.313315 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852"} Jan 27 12:08:16 crc kubenswrapper[4775]: I0127 12:08:16.328573 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852" exitCode=0 Jan 27 12:08:16 crc kubenswrapper[4775]: I0127 12:08:16.328633 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852"} Jan 27 12:08:18 crc kubenswrapper[4775]: I0127 12:08:18.351918 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerStarted","Data":"26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320"} Jan 27 12:08:18 crc kubenswrapper[4775]: I0127 
12:08:18.380262 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g96dh" podStartSLOduration=3.278407124 podStartE2EDuration="7.380232258s" podCreationTimestamp="2026-01-27 12:08:11 +0000 UTC" firstStartedPulling="2026-01-27 12:08:13.29780845 +0000 UTC m=+2872.439406227" lastFinishedPulling="2026-01-27 12:08:17.399633584 +0000 UTC m=+2876.541231361" observedRunningTime="2026-01-27 12:08:18.373411766 +0000 UTC m=+2877.515009563" watchObservedRunningTime="2026-01-27 12:08:18.380232258 +0000 UTC m=+2877.521830035" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.246422 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.251656 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.266413 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.395950 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.396130 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.396174 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498589 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498826 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.498875 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 
12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.499575 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.499667 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.524033 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"redhat-marketplace-9vzf4\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:19 crc kubenswrapper[4775]: I0127 12:08:19.585860 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:20 crc kubenswrapper[4775]: I0127 12:08:20.159485 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:20 crc kubenswrapper[4775]: W0127 12:08:20.166762 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5be7b52b_2651_4ee2_ab40_fef637a295e9.slice/crio-368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b WatchSource:0}: Error finding container 368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b: Status 404 returned error can't find the container with id 368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b Jan 27 12:08:20 crc kubenswrapper[4775]: I0127 12:08:20.380868 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b"} Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.035804 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.040194 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.081158 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.142362 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.143537 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.143805 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.245600 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.246352 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.247014 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.248207 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.248488 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.285481 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"redhat-operators-jw487\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.389204 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.398634 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" exitCode=0 Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.398710 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776"} Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.944101 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:21 crc kubenswrapper[4775]: W0127 12:08:21.951051 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod435eff0a_268d_44de_921d_217e8067a11d.slice/crio-5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9 WatchSource:0}: Error finding container 5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9: Status 404 returned error can't find the container with id 5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9 Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.964020 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:21 crc kubenswrapper[4775]: I0127 12:08:21.964087 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.030218 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.428736 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442341 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9" exitCode=0 Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442486 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9"} Jan 27 12:08:22 crc kubenswrapper[4775]: I0127 12:08:22.442545 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9"} Jan 27 12:08:22 crc 
kubenswrapper[4775]: I0127 12:08:22.521415 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.456117 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7"} Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.460737 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" exitCode=0 Jan 27 12:08:23 crc kubenswrapper[4775]: I0127 12:08:23.460818 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} Jan 27 12:08:24 crc kubenswrapper[4775]: I0127 12:08:24.473474 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerStarted","Data":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} Jan 27 12:08:24 crc kubenswrapper[4775]: I0127 12:08:24.498862 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9vzf4" podStartSLOduration=2.84972763 podStartE2EDuration="5.498841653s" podCreationTimestamp="2026-01-27 12:08:19 +0000 UTC" firstStartedPulling="2026-01-27 12:08:21.413882951 +0000 UTC m=+2880.555480728" lastFinishedPulling="2026-01-27 12:08:24.062996974 +0000 UTC m=+2883.204594751" observedRunningTime="2026-01-27 12:08:24.494584749 +0000 UTC m=+2883.636182556" watchObservedRunningTime="2026-01-27 12:08:24.498841653 +0000 UTC m=+2883.640439430" Jan 27 12:08:25 crc kubenswrapper[4775]: I0127 12:08:25.606392 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:25 crc kubenswrapper[4775]: I0127 12:08:25.606728 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g96dh" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" containerID="cri-o://26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" gracePeriod=2 Jan 27 12:08:26 crc kubenswrapper[4775]: I0127 12:08:26.494916 4775 generic.go:334] "Generic (PLEG): container finished" podID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerID="26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" exitCode=0 Jan 27 12:08:26 crc kubenswrapper[4775]: I0127 12:08:26.495011 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320"} Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.231511 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280576 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.280672 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") pod \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\" (UID: \"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32\") " Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.281276 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities" (OuterVolumeSpecName: "utilities") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.287826 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6" (OuterVolumeSpecName: "kube-api-access-2ksc6") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "kube-api-access-2ksc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.320318 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" (UID: "f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382547 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382591 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.382607 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ksc6\" (UniqueName: \"kubernetes.io/projected/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32-kube-api-access-2ksc6\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506424 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g96dh" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506483 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g96dh" event={"ID":"f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32","Type":"ContainerDied","Data":"49113f27bda5f8b5de5626a113acf85f302b42d3df20c2e33e6424f2d1c022cf"} Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.506651 4775 scope.go:117] "RemoveContainer" containerID="26668656c40dde07369c921dabbcccf1fec82c2945ab9d9233c398f34d167320" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.525488 4775 scope.go:117] "RemoveContainer" containerID="0efe146bb28c9c980ce61511ad6dc594182c0b72d3f190d3957a78c261f5e852" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.540910 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.549560 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g96dh"] Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.568726 4775 scope.go:117] "RemoveContainer" containerID="0d3daee2a63c141891156804999c85b76b5da3af98a91ac88ceee1184f9daa1b" Jan 27 12:08:27 crc kubenswrapper[4775]: I0127 12:08:27.755816 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" path="/var/lib/kubelet/pods/f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32/volumes" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.586746 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.586818 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:29 crc kubenswrapper[4775]: I0127 12:08:29.646273 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:30 crc kubenswrapper[4775]: I0127 12:08:30.586762 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:31 crc kubenswrapper[4775]: I0127 12:08:31.004661 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.562919 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7" exitCode=0 Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.562992 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7"} Jan 27 12:08:32 crc kubenswrapper[4775]: I0127 12:08:32.563593 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9vzf4" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" containerID="cri-o://5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" gracePeriod=2 Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.543987 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.591883 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerStarted","Data":"c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596480 4775 generic.go:334] "Generic (PLEG): container finished" podID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" exitCode=0 Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596547 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596592 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9vzf4" event={"ID":"5be7b52b-2651-4ee2-ab40-fef637a295e9","Type":"ContainerDied","Data":"368ff5a2426ddc48e238a39b06650b457d235651479833eac083cc173837314b"} Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596617 4775 scope.go:117] "RemoveContainer" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.596642 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9vzf4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616240 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616387 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.616428 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") pod \"5be7b52b-2651-4ee2-ab40-fef637a295e9\" (UID: \"5be7b52b-2651-4ee2-ab40-fef637a295e9\") " Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.618957 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities" (OuterVolumeSpecName: "utilities") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.625607 4775 scope.go:117] "RemoveContainer" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.638052 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj" (OuterVolumeSpecName: "kube-api-access-9zdlj") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "kube-api-access-9zdlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.640818 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5be7b52b-2651-4ee2-ab40-fef637a295e9" (UID: "5be7b52b-2651-4ee2-ab40-fef637a295e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.654893 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jw487" podStartSLOduration=3.111946289 podStartE2EDuration="13.654863921s" podCreationTimestamp="2026-01-27 12:08:20 +0000 UTC" firstStartedPulling="2026-01-27 12:08:22.447104369 +0000 UTC m=+2881.588702146" lastFinishedPulling="2026-01-27 12:08:32.990022001 +0000 UTC m=+2892.131619778" observedRunningTime="2026-01-27 12:08:33.627925132 +0000 UTC m=+2892.769522909" watchObservedRunningTime="2026-01-27 12:08:33.654863921 +0000 UTC m=+2892.796461698" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.700690 4775 scope.go:117] "RemoveContainer" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719007 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719050 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zdlj\" (UniqueName: \"kubernetes.io/projected/5be7b52b-2651-4ee2-ab40-fef637a295e9-kube-api-access-9zdlj\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.719071 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5be7b52b-2651-4ee2-ab40-fef637a295e9-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.751253 4775 scope.go:117] "RemoveContainer" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.752157 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": container with ID starting with 5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e not found: ID does not exist" containerID="5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752191 4775 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e"} err="failed to get container status \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": rpc error: code = NotFound desc = could not find container \"5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e\": container with ID starting with 5a788ab8a80ac51c864158708cdc27b0a354506f91b25e662484ca6a25df747e not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752209 4775 scope.go:117] "RemoveContainer" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.752563 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": container with ID starting with 556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4 not found: ID does not exist" containerID="556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752587 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4"} err="failed to get container status \"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": rpc error: code = NotFound desc = could not find container \"556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4\": container with ID starting with 556a9f660fc00e8ec90105f1e0f551e6353fab9e31abe918fcf61b534af7a8a4 not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.752600 4775 scope.go:117] "RemoveContainer" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: E0127 12:08:33.753003 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": container with ID starting with ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776 not found: ID does not exist" containerID="ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.753025 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776"} err="failed to get container status \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": rpc error: code = NotFound desc = could not find container \"ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776\": container with ID starting with ae0d9730f22ab55c8844488dba8ac76b77043a98e96baa594a5105942b946776 not found: ID does not exist" Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.930917 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:33 crc kubenswrapper[4775]: I0127 12:08:33.941932 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9vzf4"] Jan 27 12:08:35 crc kubenswrapper[4775]: I0127 12:08:35.755208 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" path="/var/lib/kubelet/pods/5be7b52b-2651-4ee2-ab40-fef637a295e9/volumes" Jan 27 12:08:41 crc 
kubenswrapper[4775]: I0127 12:08:41.390355 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:41 crc kubenswrapper[4775]: I0127 12:08:41.390994 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:42 crc kubenswrapper[4775]: I0127 12:08:42.439185 4775 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jw487" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" probeResult="failure" output=< Jan 27 12:08:42 crc kubenswrapper[4775]: timeout: failed to connect service ":50051" within 1s Jan 27 12:08:42 crc kubenswrapper[4775]: > Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.156968 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158159 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158176 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158199 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158207 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158218 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158228 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-utilities" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158245 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158253 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158266 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158275 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: E0127 12:08:48.158310 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158317 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="extract-content" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.158576 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="f84bb1e0-e9db-4d8d-87c6-ca2e6d3fee32" containerName="registry-server" Jan 27 12:08:48 crc 
kubenswrapper[4775]: I0127 12:08:48.158593 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be7b52b-2651-4ee2-ab40-fef637a295e9" containerName="registry-server" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.159870 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.162883 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7p92f"/"default-dockercfg-76fkb" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.163095 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p92f"/"kube-root-ca.crt" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.163130 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7p92f"/"openshift-service-ca.crt" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.169815 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.313912 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.314021 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.416266 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.416370 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.417170 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.443246 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"must-gather-wqwn4\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") " pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 
12:08:48.485130 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4" Jan 27 12:08:48 crc kubenswrapper[4775]: I0127 12:08:48.979715 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"] Jan 27 12:08:48 crc kubenswrapper[4775]: W0127 12:08:48.987441 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09caf0cd_6a8c_41d8_84a7_7813e19a373a.slice/crio-70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b WatchSource:0}: Error finding container 70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b: Status 404 returned error can't find the container with id 70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b Jan 27 12:08:49 crc kubenswrapper[4775]: I0127 12:08:49.764867 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"70cd7af253185801588842227f9d28568da5323c702ce7692611090ce847515b"} Jan 27 12:08:51 crc kubenswrapper[4775]: I0127 12:08:51.461788 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:51 crc kubenswrapper[4775]: I0127 12:08:51.519951 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:52 crc kubenswrapper[4775]: I0127 12:08:52.224268 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:52 crc kubenswrapper[4775]: I0127 12:08:52.811772 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jw487" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" containerID="cri-o://c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab" gracePeriod=2 Jan 27 12:08:53 crc kubenswrapper[4775]: I0127 12:08:53.823809 4775 generic.go:334] "Generic (PLEG): container finished" podID="435eff0a-268d-44de-921d-217e8067a11d" containerID="c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab" exitCode=0 Jan 27 12:08:53 crc kubenswrapper[4775]: I0127 12:08:53.824097 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab"} Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.326440 4775 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452194 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452755 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.452880 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") pod \"435eff0a-268d-44de-921d-217e8067a11d\" (UID: \"435eff0a-268d-44de-921d-217e8067a11d\") " Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.453778 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities" (OuterVolumeSpecName: "utilities") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.461993 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967" (OuterVolumeSpecName: "kube-api-access-7s967") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "kube-api-access-7s967". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.555780 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s967\" (UniqueName: \"kubernetes.io/projected/435eff0a-268d-44de-921d-217e8067a11d-kube-api-access-7s967\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.555846 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.593154 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "435eff0a-268d-44de-921d-217e8067a11d" (UID: "435eff0a-268d-44de-921d-217e8067a11d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.657754 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/435eff0a-268d-44de-921d-217e8067a11d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.854050 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"} Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.854138 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerStarted","Data":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"} Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857733 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jw487" event={"ID":"435eff0a-268d-44de-921d-217e8067a11d","Type":"ContainerDied","Data":"5008e14a9b7681092dc9f91d538f9eceeb0a4e02f7d34a2b791d82b87e0f96a9"} Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857784 4775 scope.go:117] "RemoveContainer" containerID="c21bc7652b9b6318bd9629d52a876a0ce691d0f0a26b3f6213f89b2b56b254ab" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.857815 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jw487" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.879409 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p92f/must-gather-wqwn4" podStartSLOduration=1.8302647109999999 podStartE2EDuration="8.879388406s" podCreationTimestamp="2026-01-27 12:08:48 +0000 UTC" firstStartedPulling="2026-01-27 12:08:48.990277434 +0000 UTC m=+2908.131875211" lastFinishedPulling="2026-01-27 12:08:56.039401129 +0000 UTC m=+2915.180998906" observedRunningTime="2026-01-27 12:08:56.873189682 +0000 UTC m=+2916.014787459" watchObservedRunningTime="2026-01-27 12:08:56.879388406 +0000 UTC m=+2916.020986183" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.896491 4775 scope.go:117] "RemoveContainer" containerID="8020f3a9978f9caa03e8a67aaf043bed33e7490425d2cdc1f84006a857741cf7" Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.909497 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.921945 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jw487"] Jan 27 12:08:56 crc kubenswrapper[4775]: I0127 12:08:56.938503 4775 scope.go:117] "RemoveContainer" containerID="be239197a6c3fe66ea7572883d434bb853af0124ecafe455b058cffc8a6425f9" Jan 27 12:08:57 crc kubenswrapper[4775]: I0127 12:08:57.755416 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="435eff0a-268d-44de-921d-217e8067a11d" path="/var/lib/kubelet/pods/435eff0a-268d-44de-921d-217e8067a11d/volumes" Jan 27 12:08:59 crc kubenswrapper[4775]: I0127 12:08:59.518413 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:08:59 crc kubenswrapper[4775]: I0127 12:08:59.518825 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.475393 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"] Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476126 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-utilities" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476146 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-utilities" Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476177 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-content" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476184 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="extract-content" Jan 27 12:09:00 crc kubenswrapper[4775]: E0127 12:09:00.476202 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476210 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.476429 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="435eff0a-268d-44de-921d-217e8067a11d" containerName="registry-server" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.477157 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.640217 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.640854 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.742838 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.742913 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.743118 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.769380 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"crc-debug-fnhcr\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.798109 4775 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:09:00 crc kubenswrapper[4775]: W0127 12:09:00.830604 4775 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c262d80_3666_411e_9947_d5ab93033fa7.slice/crio-14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10 WatchSource:0}: Error finding container 14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10: Status 404 returned error can't find the container with id 14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10 Jan 27 12:09:00 crc kubenswrapper[4775]: I0127 12:09:00.901550 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerStarted","Data":"14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10"} Jan 27 12:09:02 crc kubenswrapper[4775]: E0127 12:09:02.328610 4775 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.22:35796->38.102.83.22:36975: read tcp 38.102.83.22:35796->38.102.83.22:36975: read: connection reset by peer Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.653989 4775 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.654898 4775 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrssq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-fnhcr_openshift-must-gather-7p92f(9c262d80-3666-411e-9947-d5ab93033fa7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 27 12:09:16 crc kubenswrapper[4775]: E0127 12:09:16.656162 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" Jan 27 12:09:17 crc kubenswrapper[4775]: E0127 12:09:17.084518 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" Jan 27 12:09:29 crc kubenswrapper[4775]: I0127 12:09:29.518001 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:09:29 crc kubenswrapper[4775]: I0127 12:09:29.518696 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:09:37 crc kubenswrapper[4775]: I0127 12:09:37.287490 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerStarted","Data":"baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05"} Jan 27 12:09:37 crc kubenswrapper[4775]: I0127 12:09:37.303258 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" podStartSLOduration=1.6889406710000001 podStartE2EDuration="37.303236822s" podCreationTimestamp="2026-01-27 12:09:00 
+0000 UTC" firstStartedPulling="2026-01-27 12:09:00.833273022 +0000 UTC m=+2919.974870799" lastFinishedPulling="2026-01-27 12:09:36.447569173 +0000 UTC m=+2955.589166950" observedRunningTime="2026-01-27 12:09:37.300341891 +0000 UTC m=+2956.441939668" watchObservedRunningTime="2026-01-27 12:09:37.303236822 +0000 UTC m=+2956.444834589" Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518107 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518753 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.518817 4775 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.519852 4775 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"} pod="openshift-machine-config-operator/machine-config-daemon-qn99x" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 12:09:59 crc kubenswrapper[4775]: I0127 12:09:59.519916 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" containerID="cri-o://e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" gracePeriod=600 Jan 27 12:09:59 crc kubenswrapper[4775]: E0127 12:09:59.650753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527517 4775 generic.go:334] "Generic (PLEG): container finished" podID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" exitCode=0 Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527824 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerDied","Data":"e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"} Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.527864 4775 scope.go:117] "RemoveContainer" containerID="e19be9cb2470676eead68730edfaae0d37d92aa074dd1cc32c6b30d21624c365" Jan 27 12:10:00 crc kubenswrapper[4775]: I0127 12:10:00.528566 4775 scope.go:117] "RemoveContainer" 
containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:00 crc kubenswrapper[4775]: E0127 12:10:00.528811 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:06 crc kubenswrapper[4775]: I0127 12:10:06.583986 4775 generic.go:334] "Generic (PLEG): container finished" podID="9c262d80-3666-411e-9947-d5ab93033fa7" containerID="baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05" exitCode=0 Jan 27 12:10:06 crc kubenswrapper[4775]: I0127 12:10:06.584077 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" event={"ID":"9c262d80-3666-411e-9947-d5ab93033fa7","Type":"ContainerDied","Data":"baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05"} Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.708678 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.760359 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"] Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.760408 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-fnhcr"] Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.869991 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") pod \"9c262d80-3666-411e-9947-d5ab93033fa7\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870084 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") pod \"9c262d80-3666-411e-9947-d5ab93033fa7\" (UID: \"9c262d80-3666-411e-9947-d5ab93033fa7\") " Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host" (OuterVolumeSpecName: "host") pod "9c262d80-3666-411e-9947-d5ab93033fa7" (UID: "9c262d80-3666-411e-9947-d5ab93033fa7"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.870830 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9c262d80-3666-411e-9947-d5ab93033fa7-host\") on node \"crc\" DevicePath \"\"" Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.876483 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq" (OuterVolumeSpecName: "kube-api-access-wrssq") pod "9c262d80-3666-411e-9947-d5ab93033fa7" (UID: "9c262d80-3666-411e-9947-d5ab93033fa7"). InnerVolumeSpecName "kube-api-access-wrssq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:10:07 crc kubenswrapper[4775]: I0127 12:10:07.972619 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrssq\" (UniqueName: \"kubernetes.io/projected/9c262d80-3666-411e-9947-d5ab93033fa7-kube-api-access-wrssq\") on node \"crc\" DevicePath \"\"" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.603922 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14669a2847dcc22b4f70d5cc48c3bdb539db5dca9938911ad0a69f4165258b10" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.603967 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-fnhcr" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902116 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"] Jan 27 12:10:08 crc kubenswrapper[4775]: E0127 12:10:08.902502 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902514 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.902689 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" containerName="container-00" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.903291 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.991194 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:08 crc kubenswrapper[4775]: I0127 12:10:08.991276 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093540 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093658 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.093753 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " 
pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.114421 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"crc-debug-6pwdd\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.224283 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613230 4775 generic.go:334] "Generic (PLEG): container finished" podID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerID="5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1" exitCode=1 Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613296 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" event={"ID":"68e8da4d-550a-40eb-b851-4e7f2b637352","Type":"ContainerDied","Data":"5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1"} Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.613535 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" event={"ID":"68e8da4d-550a-40eb-b851-4e7f2b637352","Type":"ContainerStarted","Data":"39e44386371866f95623ce009248c5babf2f1cd2da65943d9bb0221095e0c22c"} Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.653667 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"] Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.663731 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/crc-debug-6pwdd"] Jan 27 12:10:09 crc kubenswrapper[4775]: I0127 12:10:09.756028 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c262d80-3666-411e-9947-d5ab93033fa7" path="/var/lib/kubelet/pods/9c262d80-3666-411e-9947-d5ab93033fa7/volumes" Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.727922 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830408 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") pod \"68e8da4d-550a-40eb-b851-4e7f2b637352\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830624 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") pod \"68e8da4d-550a-40eb-b851-4e7f2b637352\" (UID: \"68e8da4d-550a-40eb-b851-4e7f2b637352\") " Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.830788 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host" (OuterVolumeSpecName: "host") pod "68e8da4d-550a-40eb-b851-4e7f2b637352" (UID: "68e8da4d-550a-40eb-b851-4e7f2b637352"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.831178 4775 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/68e8da4d-550a-40eb-b851-4e7f2b637352-host\") on node \"crc\" DevicePath \"\"" Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.836676 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd" (OuterVolumeSpecName: "kube-api-access-gfrrd") pod "68e8da4d-550a-40eb-b851-4e7f2b637352" (UID: "68e8da4d-550a-40eb-b851-4e7f2b637352"). InnerVolumeSpecName "kube-api-access-gfrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 12:10:10 crc kubenswrapper[4775]: I0127 12:10:10.933035 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfrrd\" (UniqueName: \"kubernetes.io/projected/68e8da4d-550a-40eb-b851-4e7f2b637352-kube-api-access-gfrrd\") on node \"crc\" DevicePath \"\"" Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.630948 4775 scope.go:117] "RemoveContainer" containerID="5ad2805fe3e9e0db329c40394639e7fe126a36fdd4ced2486cd1e45b12b8c1a1" Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.630995 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/crc-debug-6pwdd" Jan 27 12:10:11 crc kubenswrapper[4775]: I0127 12:10:11.755814 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" path="/var/lib/kubelet/pods/68e8da4d-550a-40eb-b851-4e7f2b637352/volumes" Jan 27 12:10:12 crc kubenswrapper[4775]: I0127 12:10:12.744702 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:12 crc kubenswrapper[4775]: E0127 12:10:12.745096 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:25 crc kubenswrapper[4775]: I0127 12:10:25.746049 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:25 crc kubenswrapper[4775]: E0127 12:10:25.748871 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:26 crc kubenswrapper[4775]: I0127 12:10:26.875006 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d66b74d76-ngwn9_8fa6c814-723c-4638-8ae9-dbb9f6864120/barbican-api/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.064470 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d66b74d76-ngwn9_8fa6c814-723c-4638-8ae9-dbb9f6864120/barbican-api-log/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.158620 4775 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78f66698d-fbfmx_1138f75c-8e56-4a32-8110-8b26d9f80688/barbican-keystone-listener-log/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.183482 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78f66698d-fbfmx_1138f75c-8e56-4a32-8110-8b26d9f80688/barbican-keystone-listener/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.354561 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bd6cd4f4f-kxhrc_8874fbc9-9d42-45dd-b38b-9ba1a33340f5/barbican-worker-log/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.383178 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5bd6cd4f4f-kxhrc_8874fbc9-9d42-45dd-b38b-9ba1a33340f5/barbican-worker/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.558771 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-f4jgw_ae038f0b-5bbc-48dd-bbef-c5367dcbbfa4/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.637782 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-central-agent/1.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.683604 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-central-agent/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.760264 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-notification-agent/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.798266 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/ceilometer-notification-agent/1.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.859220 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/proxy-httpd/0.log" Jan 27 12:10:27 crc kubenswrapper[4775]: I0127 12:10:27.889496 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f0fb6dfd-0694-418a-965e-789707762ef7/sg-core/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.045497 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d670312-cbe8-44de-8f6f-857772d2af05/cinder-api/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.062201 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_3d670312-cbe8-44de-8f6f-857772d2af05/cinder-api-log/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.262746 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_030ef7f1-5f79-42e9-800e-55c4f70964e5/cinder-scheduler/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.284597 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_030ef7f1-5f79-42e9-800e-55c4f70964e5/probe/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.445989 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-spnbk_d688b7ee-365a-441b-a0ab-3d1cf6663988/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.545967 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-c4mxb_a42aa0c8-32e3-4fc7-98dd-1c583b67ecc3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.660871 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/init/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.807392 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/init/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.829476 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-66fc59ccbf-knrgp_f6c54a70-a562-4fef-b3fe-14e2a3029229/dnsmasq-dns/0.log" Jan 27 12:10:28 crc kubenswrapper[4775]: I0127 12:10:28.879815 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-wskgh_e018489b-9445-4afb-8e4c-e9d52a6781d7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.070473 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_899a9893-167d-4c9c-9495-3c663c7d0855/glance-httpd/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.092418 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_899a9893-167d-4c9c-9495-3c663c7d0855/glance-log/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.259123 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d8a9ef1-1171-438f-be81-89f670bd9735/glance-httpd/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.282683 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_2d8a9ef1-1171-438f-be81-89f670bd9735/glance-log/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.516287 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6546ffcc78-4zdnk_00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4/horizon/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.649960 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-czgtf_d002bd2d-2dcd-4ba3-841b-1306c023469b/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.775867 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6546ffcc78-4zdnk_00b7bcb5-4219-488d-9ea6-7b0fe7bb93f4/horizon-log/0.log" Jan 27 12:10:29 crc kubenswrapper[4775]: I0127 12:10:29.858212 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-87v8z_2a28c09e-4891-433d-a745-f3dcfc8654aa/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.055624 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29491921-2bnsm_5ce874bb-50b0-4a56-a322-f5590c1d19bd/keystone-cron/0.log" Jan 27 12:10:30 crc 
kubenswrapper[4775]: I0127 12:10:30.127289 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5994598694-dhq5v_94f53f42-a5fc-45f9-b94c-4f12b63d8d75/keystone-api/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.262888 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7aa68248-0707-4f5c-8689-57cf6d07c250/kube-state-metrics/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.360331 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-cn9zm_7ab3ce35-77fe-4e38-ad60-c5906f6d061a/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.640350 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c59c678b7-lbtkp_857ed116-b219-4af4-9c38-69e85db0c484/neutron-api/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.760267 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5c59c678b7-lbtkp_857ed116-b219-4af4-9c38-69e85db0c484/neutron-httpd/0.log" Jan 27 12:10:30 crc kubenswrapper[4775]: I0127 12:10:30.869838 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mxw97_352eaecd-6d51-4198-b3e6-ce59a6485be1/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.418315 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_451ba9e3-91a7-4fd5-9e95-b827186dee9d/nova-api-log/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.485992 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_21548904-8b74-4b9b-81fb-df04e62dc7df/nova-cell0-conductor-conductor/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.604351 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_451ba9e3-91a7-4fd5-9e95-b827186dee9d/nova-api-api/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.714743 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_c8d213b2-8a0b-479c-8c94-148f1afe1db0/nova-cell1-conductor-conductor/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.895413 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_80ce7ac7-056a-44ec-be77-f87a96dc23f5/nova-cell1-novncproxy-novncproxy/0.log" Jan 27 12:10:31 crc kubenswrapper[4775]: I0127 12:10:31.955663 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-27lb2_36bee79d-4a97-407b-9907-87d740929ba0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.254395 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b95ff32a-7b7f-43d8-b521-6d07c8d78c99/nova-metadata-log/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.402122 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a4732753-3f10-4604-89d0-0c074829e53f/nova-scheduler-scheduler/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.479153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/mysql-bootstrap/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.771118 4775 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/galera/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.804126 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_6108f26d-5e0a-490c-a7a4-8cefa3b99c7d/mysql-bootstrap/0.log" Jan 27 12:10:32 crc kubenswrapper[4775]: I0127 12:10:32.991311 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/mysql-bootstrap/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.118658 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/mysql-bootstrap/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.125137 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b95ff32a-7b7f-43d8-b521-6d07c8d78c99/nova-metadata-metadata/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.199267 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9bafbfb6-d113-4a0f-a1dd-0d001a5448de/galera/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.334149 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_db40a4a8-ce91-40a6-8b63-ccc17ed327da/openstackclient/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.422895 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-4hqln_cacc7142-a8d4-4607-adb7-0090fbd3024a/ovn-controller/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.580918 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9xncr_7a8845ad-fbbf-4dab-9ed8-0dbce1a8bad5/openstack-network-exporter/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.703966 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server-init/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.858710 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovs-vswitchd/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.937873 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server-init/0.log" Jan 27 12:10:33 crc kubenswrapper[4775]: I0127 12:10:33.946838 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-l9blz_b06b991d-b108-4b21-82e5-43b3662c7aee/ovsdb-server/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.112298 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-2p96g_41359e3c-21d7-4c22-bcef-0968c2f8cca5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.205953 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6bb656eb-1eea-436d-acf3-6d8a548a97e5/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.278654 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6bb656eb-1eea-436d-acf3-6d8a548a97e5/ovn-northd/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.434879 4775 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09719e3d-fd6c-4c22-8c15-8ef911bc6598/ovsdbserver-nb/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.523381 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_09719e3d-fd6c-4c22-8c15-8ef911bc6598/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.656353 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fb252ada-9191-4d2d-8ab9-d12f4668a35a/openstack-network-exporter/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.665312 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fb252ada-9191-4d2d-8ab9-d12f4668a35a/ovsdbserver-sb/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.885578 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f49dbf586-l2cmp_2b3edac4-ba7b-4c93-b66f-43ab468d290f/placement-api/0.log" Jan 27 12:10:34 crc kubenswrapper[4775]: I0127 12:10:34.999562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-f49dbf586-l2cmp_2b3edac4-ba7b-4c93-b66f-43ab468d290f/placement-log/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.039236 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.324908 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.496886 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.526500 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_bfdc849b-c2c6-44c1-b1cc-a8ea59ed459d/rabbitmq/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.684085 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/setup-container/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.707804 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_6c46c48a-ba77-4494-bc4e-f463a4072952/rabbitmq/0.log" Jan 27 12:10:35 crc kubenswrapper[4775]: I0127 12:10:35.817792 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-8p5tw_ca771db8-558f-4e69-ba8c-37ed97f534b4/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:35.999911 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-thhgd_e2226633-918b-423c-a329-bfd52943a1b0/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.152607 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-97lnm_ca504c80-0ad1-42d1-b7d7-9d8ae70b8ae0/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.258704 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-fvf2b_f349798f-861c-4071-b418-61fe20227133/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.383658 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r78nv_28d386bc-d48d-41e0-9ae2-bbe8f876ba10/ssh-known-hosts-edpm-deployment/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.644180 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66648b46df-hskmp_e22ddb6f-e33b-41ea-a24f-c97c0676e6e5/proxy-httpd/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.650153 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-66648b46df-hskmp_e22ddb6f-e33b-41ea-a24f-c97c0676e6e5/proxy-server/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.713638 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7bdl6_aa44a018-6958-4bee-895d-e7ec3966be8d/swift-ring-rebalance/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.910922 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-auditor/0.log" Jan 27 12:10:36 crc kubenswrapper[4775]: I0127 12:10:36.959657 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-reaper/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.013710 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.174084 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-auditor/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.181536 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/account-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.220488 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.222199 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.396600 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/container-updater/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.399730 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-auditor/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.422541 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-expirer/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.524460 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-replicator/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.608232 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-updater/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.646776 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/rsync/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.698785 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/object-server/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.721538 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_b2f2b115-8dea-4dfa-a28e-5322f8fb8274/swift-recon-cron/0.log" Jan 27 12:10:37 crc kubenswrapper[4775]: I0127 12:10:37.744799 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:37 crc kubenswrapper[4775]: E0127 12:10:37.745093 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.013926 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_ad1bd97e-e6eb-4923-81fc-d4fd2cfb02b4/tempest-tests-tempest-tests-runner/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.032309 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-trmfd_c5bab8d8-2ee4-4499-aa5a-9fe4f21ad398/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.276636 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9d17d9d1-39f7-417c-b058-cda582c7f7d3/test-operator-logs-container/0.log" Jan 27 12:10:38 crc kubenswrapper[4775]: I0127 12:10:38.303949 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kg4j8_6b092f27-cfd0-4c25-beab-c347f14371a1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 27 12:10:43 crc kubenswrapper[4775]: I0127 12:10:43.660083 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_07cc1808-c408-433d-aefa-f603408de606/memcached/0.log" Jan 27 12:10:51 crc kubenswrapper[4775]: I0127 12:10:51.753133 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:10:51 crc kubenswrapper[4775]: E0127 12:10:51.753869 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.305478 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.514384 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.527092 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.564197 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.677020 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/util/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.702014 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/pull/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.712635 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b8592b4bee85f54b7cb0d13a293b9a39b43f007684a97ff40ca91db6cftm5c9_dcd9d0e9-c9de-479d-b62f-f4403ffa22dd/extract/0.log" Jan 27 12:11:02 crc kubenswrapper[4775]: I0127 12:11:02.950533 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5fdc687f5-9wc4j_f04fa2a0-7af2-439a-9169-6edf5be65b35/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.164675 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-76d4d5b8f9-dvj9s_c31d5b06-1ad2-4914-96c1-e0f0b8c4974e/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.430260 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84d5bb46b-cvp5b_0cabb338-c4a1-41b4-abd6-d535b0e88406/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.432259 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-658dd65b86-jp5c7_dd9264fb-034f-46d3-8698-dcc6fc3470f6/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.726929 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-7f5ddd8d7b-58qnd_703a739a-6687-4324-b937-7d0efe7c143b/manager/0.log" Jan 27 12:11:03 crc kubenswrapper[4775]: I0127 12:11:03.746297 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:03 crc kubenswrapper[4775]: E0127 12:11:03.746575 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.004933 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-58865f87b4-s2l5z_b296a3cd-1dc1-4511-af7a-7b1801e23e61/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.300461 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54ccf4f85d-d7vhk_0da235e3-e76a-408f-8e0e-3cdd7ce76705/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.352472 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-78f8b7b89c-2wqgg_4e719fbd-ac18-4ae1-bac6-c42f1e081daa/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.505008 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-78b8f8fd84-8xrd7_6c5084e4-b0e1-46fd-ae69-c0f2ede3db17/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.769856 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b88bfc995-tzn2s_56fb2890-7d29-452c-9f24-4aa20d977f0b/manager/0.log" Jan 27 12:11:04 crc kubenswrapper[4775]: I0127 12:11:04.936930 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-569695f6c5-pmk9t_6bcdd59a-9739-40e7-9625-3e56009dcbd7/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.189351 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74ffd97575-cln8g_2a55fa83-c395-4ac2-bc2e-355ad48a4a95/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.429567 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7bd95ffd6dlkxb8_3e47cb1c-7f01-4b8d-904f-fed543678a02/manager/0.log" Jan 27 12:11:05 crc kubenswrapper[4775]: I0127 12:11:05.888292 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6bfcf7b875-z4vw8_8868fb89-f25b-48ef-b4e2-9acab9f78790/operator/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.539210 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-swjcb_56b44f0b-813c-4626-a8ec-54ac78bbb086/registry-server/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.776602 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bf6d4f946-p9vts_701902fe-7e51-44b6-923b-0a60c96d6707/manager/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.777947 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7bf4858b78-fcd9x_7df5397d-0c1f-46b4-8695-d80c752ca569/manager/0.log" Jan 27 12:11:06 crc kubenswrapper[4775]: I0127 12:11:06.983093 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-7748d79f84-vmtx4_e14198f0-3413-4350-bae5-33b23ceead05/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.058735 4775 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-g5nsq_a5e8d398-7976-4603-8409-304fa193f7f1/operator/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.335952 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-65596dbf77-9sfp8_909c9a87-2eb1-4a52-b86d-6d36524b1eb2/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.537034 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7db57dc8bf-5lbbt_01a03f23-ead5-4a15-976f-4dda2622083b/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.697967 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-6c866cfdcb-2mz97_5070c545-d4c0-46b3-afb9-c130dc982406/manager/0.log" Jan 27 12:11:07 crc kubenswrapper[4775]: I0127 12:11:07.835330 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6476466c7c-lb4h8_bea84175-0947-45e5-a635-b7d32a0442c6/manager/0.log" Jan 27 12:11:08 crc kubenswrapper[4775]: I0127 12:11:08.183740 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76958f4d87-8js8k_2ecfe007-a4bf-4c31-bc83-36f4c5f00815/manager/0.log" Jan 27 12:11:10 crc kubenswrapper[4775]: I0127 12:11:10.397842 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-75b8f798ff-t29z2_04cbcc0c-4375-44f0-9461-b43492e9d95b/manager/0.log" Jan 27 12:11:16 crc kubenswrapper[4775]: I0127 12:11:16.745160 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:16 crc kubenswrapper[4775]: E0127 12:11:16.745910 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:28 crc kubenswrapper[4775]: I0127 12:11:28.744726 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:28 crc kubenswrapper[4775]: E0127 12:11:28.745858 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:29 crc kubenswrapper[4775]: I0127 12:11:29.444041 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-gl7ql_87a94d4a-7341-4e6c-8194-a2e6832dbb01/control-plane-machine-set-operator/0.log" Jan 27 12:11:29 crc kubenswrapper[4775]: I0127 12:11:29.653872 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-sknjj_f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd/machine-api-operator/0.log" Jan 27 12:11:29 crc 
kubenswrapper[4775]: I0127 12:11:29.709248 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-sknjj_f438c1d7-d1f7-4ce7-9387-ca3d7cdfe7dd/kube-rbac-proxy/0.log" Jan 27 12:11:41 crc kubenswrapper[4775]: I0127 12:11:41.761967 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:41 crc kubenswrapper[4775]: E0127 12:11:41.762732 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.023137 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-xpr9c_6b64e5cd-1b80-489b-8d69-3ebf7862eb9f/cert-manager-controller/0.log" Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.229985 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-4sq7k_ea378b66-945f-4832-b293-59576474b63c/cert-manager-cainjector/0.log" Jan 27 12:11:42 crc kubenswrapper[4775]: I0127 12:11:42.300639 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-5w45m_882dbf86-77c4-46a5-a75b-b7b4a70d3ac1/cert-manager-webhook/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.077356 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-tm9vw_76d9c92d-c012-448b-8ff5-00f10c17c5a7/nmstate-console-plugin/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.291167 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4vtwf_0aa6cbcb-077f-4ae7-85b2-d79679ef64df/nmstate-handler/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.344712 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2qhwx_4c84a5ec-b41d-4396-adea-3c9964cc7c59/kube-rbac-proxy/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.406067 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-2qhwx_4c84a5ec-b41d-4396-adea-3c9964cc7c59/nmstate-metrics/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.537976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-znzng_cc4143f9-7f09-4aed-ba2c-29c7e74c5b2f/nmstate-operator/0.log" Jan 27 12:11:54 crc kubenswrapper[4775]: I0127 12:11:54.628976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-d9lzh_d9f9feec-ee04-44de-8879-4071243ac6db/nmstate-webhook/0.log" Jan 27 12:11:55 crc kubenswrapper[4775]: I0127 12:11:55.745340 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:11:55 crc kubenswrapper[4775]: E0127 12:11:55.745743 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:12:06 crc kubenswrapper[4775]: I0127 12:12:06.744981 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:12:06 crc kubenswrapper[4775]: E0127 12:12:06.745920 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:12:19 crc kubenswrapper[4775]: I0127 12:12:19.745525 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:12:19 crc kubenswrapper[4775]: E0127 12:12:19.746253 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.012828 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4tjsf_6bd75754-cf96-4b57-bfd3-711aa3dc06e6/kube-rbac-proxy/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.149664 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-4tjsf_6bd75754-cf96-4b57-bfd3-711aa3dc06e6/controller/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.223716 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.461727 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.479694 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.491172 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.494672 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.728079 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.739790 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 
12:12:21.781840 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.833178 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log" Jan 27 12:12:21 crc kubenswrapper[4775]: I0127 12:12:21.985898 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-metrics/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.010211 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-reloader/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.019223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/cp-frr-files/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.055434 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/controller/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.217133 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/frr-metrics/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.256759 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/kube-rbac-proxy/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.307319 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/kube-rbac-proxy-frr/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.469751 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/reloader/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.631847 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-ht6jz_de8a1d9c-9c8b-4200-92ae-b82c65b24d56/frr-k8s-webhook-server/0.log" Jan 27 12:12:22 crc kubenswrapper[4775]: I0127 12:12:22.844223 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7c8c7fc46c-g7l74_7560029a-575e-4d87-b4e8-4f090c5a7cd9/manager/0.log" Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.020972 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6b85bfbbbb-bb966_acb19b04-4cd3-4304-a572-d25d4aa2932b/webhook-server/0.log" Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.190096 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qm9dq_5573a041-6f7e-4c23-b2ea-42de01c96cdd/kube-rbac-proxy/0.log" Jan 27 12:12:23 crc kubenswrapper[4775]: I0127 12:12:23.966702 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-qm9dq_5573a041-6f7e-4c23-b2ea-42de01c96cdd/speaker/0.log" Jan 27 12:12:24 crc kubenswrapper[4775]: I0127 12:12:24.034750 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-52txr_ac3b8043-04c7-4036-9dc5-6068d914356c/frr/0.log" Jan 27 12:12:34 crc kubenswrapper[4775]: I0127 12:12:34.745679 4775 scope.go:117] "RemoveContainer" 
containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:12:34 crc kubenswrapper[4775]: E0127 12:12:34.746520 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.090406 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.336862 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.337010 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.351627 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.518769 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/util/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.526851 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/extract/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.548945 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dclzdvk_99ed53a2-63f4-4636-b581-2a686d44d5d0/pull/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.705310 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.892716 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.904573 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log" Jan 27 12:12:36 crc kubenswrapper[4775]: I0127 12:12:36.947944 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.125639 4775 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/extract/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.151152 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/util/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.172569 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec7137xls8_252d02e0-ca7d-405f-8315-3588f55a7b0c/pull/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.304485 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.492604 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.509061 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.524731 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.676976 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-utilities/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.741914 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/extract-content/0.log" Jan 27 12:12:37 crc kubenswrapper[4775]: I0127 12:12:37.901878 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.069307 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-5mgmj_b55d8922-b4e4-4162-acbe-4294c4746204/registry-server/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.172621 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.177168 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.211640 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.381960 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-content/0.log" Jan 27 12:12:38 crc 
kubenswrapper[4775]: I0127 12:12:38.414087 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/extract-utilities/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.630959 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/1.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.712801 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qxmcq_fc92bcc5-aeca-4736-b861-e6f1540a15d1/marketplace-operator/2.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.844771 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-klf7d_30eb115d-82ef-4c37-8cf4-4f2945ad86c1/registry-server/0.log" Jan 27 12:12:38 crc kubenswrapper[4775]: I0127 12:12:38.871369 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.025988 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.061738 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.061794 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.240276 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-content/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.244055 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/extract-utilities/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.358232 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-xbvgj_9db1a996-ad2f-460c-9d8d-cacc63c4924d/registry-server/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.462549 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.635072 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.674788 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.696109 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log" Jan 27 12:12:39 crc 
kubenswrapper[4775]: I0127 12:12:39.817806 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-content/0.log" Jan 27 12:12:39 crc kubenswrapper[4775]: I0127 12:12:39.855562 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/extract-utilities/0.log" Jan 27 12:12:40 crc kubenswrapper[4775]: I0127 12:12:40.368916 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-87qp8_c6ef80c4-f4f3-4ba1-b98e-63738725009d/registry-server/0.log" Jan 27 12:12:48 crc kubenswrapper[4775]: I0127 12:12:48.745244 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:12:48 crc kubenswrapper[4775]: E0127 12:12:48.747173 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:13:00 crc kubenswrapper[4775]: I0127 12:13:00.744960 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:13:00 crc kubenswrapper[4775]: E0127 12:13:00.745753 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:13:11 crc kubenswrapper[4775]: I0127 12:13:11.753783 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:13:11 crc kubenswrapper[4775]: E0127 12:13:11.754737 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:13:26 crc kubenswrapper[4775]: I0127 12:13:26.745267 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 12:13:26 crc kubenswrapper[4775]: E0127 12:13:26.746181 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" Jan 27 12:13:41 crc kubenswrapper[4775]: I0127 12:13:41.753182 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a" Jan 27 
12:13:41 crc kubenswrapper[4775]: E0127 12:13:41.754091 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:13:54 crc kubenswrapper[4775]: I0127 12:13:54.751439 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:13:54 crc kubenswrapper[4775]: E0127 12:13:54.753946 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:09 crc kubenswrapper[4775]: I0127 12:14:09.747964 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:09 crc kubenswrapper[4775]: E0127 12:14:09.749084 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:21 crc kubenswrapper[4775]: I0127 12:14:21.754055 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:21 crc kubenswrapper[4775]: E0127 12:14:21.755026 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.007475 4775 generic.go:334] "Generic (PLEG): container finished" podID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b" exitCode=0
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.007580 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7p92f/must-gather-wqwn4" event={"ID":"09caf0cd-6a8c-41d8-84a7-7813e19a373a","Type":"ContainerDied","Data":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"}
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.008761 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:30 crc kubenswrapper[4775]: I0127 12:14:30.502007 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/gather/0.log"
Jan 27 12:14:34 crc kubenswrapper[4775]: I0127 12:14:34.745858 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:34 crc kubenswrapper[4775]: E0127 12:14:34.746713 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.377511 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"]
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.378355 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7p92f/must-gather-wqwn4" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy" containerID="cri-o://d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3" gracePeriod=2
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.403840 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7p92f/must-gather-wqwn4"]
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.905927 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/copy/0.log"
Jan 27 12:14:39 crc kubenswrapper[4775]: I0127 12:14:39.906576 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.042639 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") pod \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") "
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.042813 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") pod \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\" (UID: \"09caf0cd-6a8c-41d8-84a7-7813e19a373a\") "
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.061356 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6" (OuterVolumeSpecName: "kube-api-access-84zh6") pod "09caf0cd-6a8c-41d8-84a7-7813e19a373a" (UID: "09caf0cd-6a8c-41d8-84a7-7813e19a373a"). InnerVolumeSpecName "kube-api-access-84zh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102237 4775 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7p92f_must-gather-wqwn4_09caf0cd-6a8c-41d8-84a7-7813e19a373a/copy/0.log"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102578 4775 generic.go:334] "Generic (PLEG): container finished" podID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3" exitCode=143
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102631 4775 scope.go:117] "RemoveContainer" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.102775 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7p92f/must-gather-wqwn4"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.132147 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.145246 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84zh6\" (UniqueName: \"kubernetes.io/projected/09caf0cd-6a8c-41d8-84a7-7813e19a373a-kube-api-access-84zh6\") on node \"crc\" DevicePath \"\""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.226170 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09caf0cd-6a8c-41d8-84a7-7813e19a373a" (UID: "09caf0cd-6a8c-41d8-84a7-7813e19a373a"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.246756 4775 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09caf0cd-6a8c-41d8-84a7-7813e19a373a-must-gather-output\") on node \"crc\" DevicePath \"\""
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.256117 4775 scope.go:117] "RemoveContainer" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: E0127 12:14:40.257909 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": container with ID starting with d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3 not found: ID does not exist" containerID="d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.257970 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3"} err="failed to get container status \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": rpc error: code = NotFound desc = could not find container \"d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3\": container with ID starting with d0bee6bbd0213c5355f6d524c531a14a570c8443edc2289d72b0df5984a888f3 not found: ID does not exist"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.258000 4775 scope.go:117] "RemoveContainer" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: E0127 12:14:40.259100 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": container with ID starting with 14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b not found: ID does not exist" containerID="14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"
Jan 27 12:14:40 crc kubenswrapper[4775]: I0127 12:14:40.259158 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b"} err="failed to get container status \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": rpc error: code = NotFound desc = could not find container \"14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b\": container with ID starting with 14945994ed90da9cb454811da0615d569dfb97b4d44357d06052738cda61ab2b not found: ID does not exist"
Jan 27 12:14:41 crc kubenswrapper[4775]: I0127 12:14:41.760479 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" path="/var/lib/kubelet/pods/09caf0cd-6a8c-41d8-84a7-7813e19a373a/volumes"
Jan 27 12:14:49 crc kubenswrapper[4775]: I0127 12:14:49.745582 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:14:49 crc kubenswrapper[4775]: E0127 12:14:49.747555 4775 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qn99x_openshift-machine-config-operator(7707cf23-0a23-4f57-8184-f7a4f7587aa2)\"" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.223574 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224789 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224814 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224832 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224840 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: E0127 12:14:51.224855 4775 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.224894 4775 state_mem.go:107] "Deleted CPUSet assignment" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225147 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="gather"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225164 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="09caf0cd-6a8c-41d8-84a7-7813e19a373a" containerName="copy"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.225187 4775 memory_manager.go:354] "RemoveStaleState removing state" podUID="68e8da4d-550a-40eb-b851-4e7f2b637352" containerName="container-00"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.227109 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.242710 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377295 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377386 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.377538 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480634 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480755 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.480929 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.481521 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.481723 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.510948 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"community-operators-n5z9l\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") " pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.563031 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:14:51 crc kubenswrapper[4775]: I0127 12:14:51.944785 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:14:52 crc kubenswrapper[4775]: I0127 12:14:52.216588 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"}
Jan 27 12:14:52 crc kubenswrapper[4775]: I0127 12:14:52.216637 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"e10b0c1057c8fe3dc02028a5148e954d203d6bc10424ddb7a56cc51b4b561ace"}
Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.225968 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209" exitCode=0
Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.226109 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"}
Jan 27 12:14:53 crc kubenswrapper[4775]: I0127 12:14:53.229105 4775 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 12:14:54 crc kubenswrapper[4775]: I0127 12:14:54.236858 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"}
Jan 27 12:14:58 crc kubenswrapper[4775]: I0127 12:14:58.268944 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30" exitCode=0
Jan 27 12:14:58 crc kubenswrapper[4775]: I0127 12:14:58.269043 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"}
Jan 27 12:14:59 crc kubenswrapper[4775]: I0127 12:14:59.280900 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerStarted","Data":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"}
Jan 27 12:14:59 crc kubenswrapper[4775]: I0127 12:14:59.311914 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n5z9l" podStartSLOduration=2.581519847 podStartE2EDuration="8.311886425s" podCreationTimestamp="2026-01-27 12:14:51 +0000 UTC" firstStartedPulling="2026-01-27 12:14:53.228908937 +0000 UTC m=+3272.370506704" lastFinishedPulling="2026-01-27 12:14:58.959275505 +0000 UTC m=+3278.100873282" observedRunningTime="2026-01-27 12:14:59.303823117 +0000 UTC m=+3278.445420914" watchObservedRunningTime="2026-01-27 12:14:59.311886425 +0000 UTC m=+3278.453484202"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.156943 4775 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"]
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.158969 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.161562 4775 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.162216 4775 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.190099 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"]
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262114 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262505 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.262565 4775 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365090 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365150 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.365279 4775 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.366351 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.371710 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.386983 4775 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"collect-profiles-29491935-vhz89\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.482233 4775 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:00 crc kubenswrapper[4775]: I0127 12:15:00.957157 4775 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"]
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.304006 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerStarted","Data":"c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8"}
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.304576 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerStarted","Data":"1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319"}
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.329641 4775 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" podStartSLOduration=1.32962109 podStartE2EDuration="1.32962109s" podCreationTimestamp="2026-01-27 12:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 12:15:01.328716705 +0000 UTC m=+3280.470314472" watchObservedRunningTime="2026-01-27 12:15:01.32962109 +0000 UTC m=+3280.471218867"
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.563797 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.563856 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:01 crc kubenswrapper[4775]: I0127 12:15:01.608756 4775 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:02 crc kubenswrapper[4775]: I0127 12:15:02.314752 4775 generic.go:334] "Generic (PLEG): container finished" podID="c654a0c1-9099-4854-bf80-86bf948aac80" containerID="c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8" exitCode=0
Jan 27 12:15:02 crc kubenswrapper[4775]: I0127 12:15:02.314823 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerDied","Data":"c252616439a676bb9e6b06343c551fab6d8b758f7a28eff5e2046c4ca7050ea8"}
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.666359 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729354 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") "
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729435 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") "
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.729619 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") pod \"c654a0c1-9099-4854-bf80-86bf948aac80\" (UID: \"c654a0c1-9099-4854-bf80-86bf948aac80\") "
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.730289 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume" (OuterVolumeSpecName: "config-volume") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.735732 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.735839 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7" (OuterVolumeSpecName: "kube-api-access-qqpd7") pod "c654a0c1-9099-4854-bf80-86bf948aac80" (UID: "c654a0c1-9099-4854-bf80-86bf948aac80"). InnerVolumeSpecName "kube-api-access-qqpd7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833276 4775 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c654a0c1-9099-4854-bf80-86bf948aac80-secret-volume\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833512 4775 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c654a0c1-9099-4854-bf80-86bf948aac80-config-volume\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:03 crc kubenswrapper[4775]: I0127 12:15:03.833573 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqpd7\" (UniqueName: \"kubernetes.io/projected/c654a0c1-9099-4854-bf80-86bf948aac80-kube-api-access-qqpd7\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:04 crc kubenswrapper[4775]: E0127 12:15:04.003772 4775 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc654a0c1_9099_4854_bf80_86bf948aac80.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc654a0c1_9099_4854_bf80_86bf948aac80.slice/crio-1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319\": RecentStats: unable to find data in memory cache]"
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333322 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89" event={"ID":"c654a0c1-9099-4854-bf80-86bf948aac80","Type":"ContainerDied","Data":"1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319"}
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333357 4775 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be808e79337c5fa8ad48d85374ede604629a29f9ea354f83487d4c02d3a6319"
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.333690 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29491935-vhz89"
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.400385 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"]
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.409715 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29491890-4glmj"]
Jan 27 12:15:04 crc kubenswrapper[4775]: I0127 12:15:04.744951 4775 scope.go:117] "RemoveContainer" containerID="e06ec7cef9fa2cfcbfa30781541a11014b6d0ea3fc7eb9dd44c9778b2a10137a"
Jan 27 12:15:05 crc kubenswrapper[4775]: I0127 12:15:05.756201 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fb6e2d5-5884-4a3b-84a1-88a5ee052da9" path="/var/lib/kubelet/pods/2fb6e2d5-5884-4a3b-84a1-88a5ee052da9/volumes"
Jan 27 12:15:06 crc kubenswrapper[4775]: I0127 12:15:06.353638 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" event={"ID":"7707cf23-0a23-4f57-8184-f7a4f7587aa2","Type":"ContainerStarted","Data":"9b7ace790e0aa7d5a9cb6a8918be4ce2919c74f847ec7ba5948065c26c7daa93"}
Jan 27 12:15:11 crc kubenswrapper[4775]: I0127 12:15:11.610248 4775 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:11 crc kubenswrapper[4775]: I0127 12:15:11.660936 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:15:12 crc kubenswrapper[4775]: I0127 12:15:12.406246 4775 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n5z9l" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerName="registry-server" containerID="cri-o://b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" gracePeriod=2
Jan 27 12:15:12 crc kubenswrapper[4775]: I0127 12:15:12.949604 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.119585 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") "
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120191 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") "
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120324 4775 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") pod \"cbff1de5-dd70-4733-8a6f-8538d9940aee\" (UID: \"cbff1de5-dd70-4733-8a6f-8538d9940aee\") "
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.120761 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities" (OuterVolumeSpecName: "utilities") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.121061 4775 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.126924 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs" (OuterVolumeSpecName: "kube-api-access-78vzs") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "kube-api-access-78vzs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.173295 4775 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cbff1de5-dd70-4733-8a6f-8538d9940aee" (UID: "cbff1de5-dd70-4733-8a6f-8538d9940aee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.223290 4775 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78vzs\" (UniqueName: \"kubernetes.io/projected/cbff1de5-dd70-4733-8a6f-8538d9940aee-kube-api-access-78vzs\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.223341 4775 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cbff1de5-dd70-4733-8a6f-8538d9940aee-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.416950 4775 generic.go:334] "Generic (PLEG): container finished" podID="cbff1de5-dd70-4733-8a6f-8538d9940aee" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03" exitCode=0
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.416994 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"}
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417020 4775 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n5z9l" event={"ID":"cbff1de5-dd70-4733-8a6f-8538d9940aee","Type":"ContainerDied","Data":"e10b0c1057c8fe3dc02028a5148e954d203d6bc10424ddb7a56cc51b4b561ace"}
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417038 4775 scope.go:117] "RemoveContainer" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.417148 4775 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n5z9l"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.447840 4775 scope.go:117] "RemoveContainer" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.454397 4775 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.466511 4775 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n5z9l"]
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.471303 4775 scope.go:117] "RemoveContainer" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.514556 4775 scope.go:117] "RemoveContainer" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"
Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.514928 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": container with ID starting with b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03 not found: ID does not exist" containerID="b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.514983 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03"} err="failed to get container status \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": rpc error: code = NotFound desc = could not find container \"b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03\": container with ID starting with b8a6995e7c1c45aadd849147c0f891a64af00d844a6cd9d5bab229607e6f5d03 not found: ID does not exist"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515017 4775 scope.go:117] "RemoveContainer" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"
Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.515377 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": container with ID starting with 8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30 not found: ID does not exist" containerID="8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515429 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30"} err="failed to get container status \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": rpc error: code = NotFound desc = could not find container \"8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30\": container with ID starting with 8455aae42d21ee447b2c8e5210bb268fe34264cc674ab0c9d59e6f32788cea30 not found: ID does not exist"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515481 4775 scope.go:117] "RemoveContainer" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"
Jan 27 12:15:13 crc kubenswrapper[4775]: E0127 12:15:13.515817 4775 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": container with ID starting with 8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209 not found: ID does not exist" containerID="8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.515840 4775 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209"} err="failed to get container status \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": rpc error: code = NotFound desc = could not find container \"8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209\": container with ID starting with 8a0c21dc2378c932371bd407f1f72f4adbbaefcbf35ec568f6b0cf62ae653209 not found: ID does not exist"
Jan 27 12:15:13 crc kubenswrapper[4775]: I0127 12:15:13.774559 4775 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbff1de5-dd70-4733-8a6f-8538d9940aee" path="/var/lib/kubelet/pods/cbff1de5-dd70-4733-8a6f-8538d9940aee/volumes"
Jan 27 12:15:42 crc kubenswrapper[4775]: I0127 12:15:42.605724 4775 scope.go:117] "RemoveContainer" containerID="baa01a4c6fe93fc697e5252cef256367e24ac68983a3bf4c9c9429de1629fe05"
Jan 27 12:15:42 crc kubenswrapper[4775]: I0127 12:15:42.631672 4775 scope.go:117] "RemoveContainer" containerID="0f3580828c538a1fd2620d795cca4ebbc4512c90dd73f2436a5638637886ada1"
Jan 27 12:17:29 crc kubenswrapper[4775]: I0127 12:17:29.517277 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:17:29 crc kubenswrapper[4775]: I0127 12:17:29.517958 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 12:17:59 crc kubenswrapper[4775]: I0127 12:17:59.517882 4775 patch_prober.go:28] interesting pod/machine-config-daemon-qn99x container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 12:17:59 crc kubenswrapper[4775]: I0127 12:17:59.518637 4775 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qn99x" podUID="7707cf23-0a23-4f57-8184-f7a4f7587aa2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"